Dissertations / Theses on the topic 'Donnee met'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Donnee met.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.
Browse dissertations / theses in a wide variety of disciplines and organise your bibliography correctly.
Bonnamy, Sylvie. "Caractérisation des produits pétroliers lors de la pyrolyse de leur fraction lourde : étude géochimique et structurale." Orléans, 1987. http://www.theses.fr/1987ORLE2027.
Holtzapffel, Thierry. "Minéraux argileux lattes : les smectites du domaine atlantique." Angers, 1986. http://www.theses.fr/1986ANGE0006.
Guérin, Jacques. "Analyse statistique de donnees bacteriologiques du milieu marin par classification hierarchique." Rennes 1, 1990. http://www.theses.fr/1990REN1T087.
Armillotta, Francesca <1977>. "Effetti della somministrazione di Testosterone in donne." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2012. http://amsdottorato.unibo.it/4511/.
As we know, testosterone (T) plays a major role in many different physiological functions. The role of T in women is still largely unknown. Recent data report an important role of T in modulating the female sexual response. AIM: The aim of our study was to investigate the effects of T in women on metabolic parameters and body composition, and the effects of T on vaginal histology, proliferation and innervation. METHODS: 16 ovariectomized FtM subjects received TU 1000 mg i.m. with placebo or dutasteride. At weeks 0 and 54 the following measurements were performed: body composition and metabolic parameters. 16 vaginal samples from 16 FtM subjects treated with T, and from 16 PrM and 16 M subjects, were collected. Morphology, glycogen content, Ki-67 proliferation, estrogen receptor, innervation and androgen receptor were evaluated. RESULTS: The administration of T in FtM determines a non-significant increase in LDL cholesterol and a significant decrease in HDL cholesterol. HOMA is significantly reduced in the TU-alone group and tends to increase in the TU + D group. The hematocrit increases significantly. BMI, WHR and fat mass tend to decrease. Lean body mass tends to increase. No significant changes in bone metabolism have been reported. Vaginal samples from FtM showed a loss of the normal architecture of the epithelium. T administration resulted in a strong reduction of proliferation. Stromal and epithelial ERα and stromal PGP 9.5 expression were significantly decreased in FtM. ARs were detected in mucosa and stroma. In the mucosa, AR density decreases with age but does not change with T. In the stroma, AR density increases with T. CONCLUSIONS: In conclusion, no major adverse effects were reported after T administration. T administration determines changes in histomorphology and reduces proliferation of the vaginal epithelium. We found AR expression in epithelium and stroma. T increases AR expression in the stroma.
Rinckenbach, Thierry. "Diagenese minerale des sediments petroliferes du delta fossile de la mahakam (indonesie) : evolution mineralogique et isotopique des composants argileux et histoire thermique." Université Louis Pasteur (Strasbourg) (1971-2008), 1988. http://www.theses.fr/1988STR13117.
Oh, Jae-Ho. "Etude structurale de la graphitation naturelle : exemples de bassins sud-coreens." Orléans, 1987. http://www.theses.fr/1987ORLE2043.
Full textNicolae, Lerma Alexandre. "Approche analytique et étude prospective de l'aléa de submersion : de la donnée observée à la modélisation, à Carthagène des Indes, Colombie." Paris 1, 2012. http://www.theses.fr/2012PA010636.
Ayoub, Nadia. "Variabilite du niveau de la mer et de la circulation en mediterranee a partir de donnees altimetriques et de champs de vent : comparaison avec des simulations numeriques." Toulouse 3, 1997. http://www.theses.fr/1997TOU30254.
Nalpas, Thierry. "Inversion des grabens du sud de la mer du nord. Donnees de sub-surface et modelisation analogique." Rennes 1, 1994. http://www.theses.fr/1994REN10085.
Bertrand, Gilles. "Le conflit helléno-turc : nouvelles donnes et nouveaux acteurs dans le système postbipolaire et à l'âge de la globalisation." Paris, Institut d'études politiques, 2000. http://www.theses.fr/2000IEPP0026.
LOPES, LAURENT. "Etude methodologique du traitement sismique : deconvolution directionnelle dans le domaine frequence - nombre d'onde ; traitement de donnees sismiques acquises avec un dispositif tracte pres du fond de la mer. application a des donnees reelles." Paris 6, 1997. http://www.theses.fr/1997PA066685.
Marra, Elena <1979>. "Valutazione dell’impiego dei test per la genotipizzazione di HPV e l’espressione degli oncogeni virali nel follow-up di donne conizzate per lesioni cervicali di alto grado nello screening del cervico-carcinoma della Regione Emilia-Romagna." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2013. http://amsdottorato.unibo.it/5572/.
Objective. To evaluate the prevalence of several HPV genotypes in patients with CIN2/3 in Emilia-Romagna, the genotype-specific HPV DNA persistence and the expression of HPV oncogenes E6/E7 during follow-up after conisation, and their role in the prediction of residual disease; to verify the applicability of new molecular diagnostic tests in cervical cancer screening. Methods. Patients with abnormal screening cytology treated by conisation (T0) for histologically confirmed CIN2/3 were included. At T0 and at 6, 12, 18 and 24 months of follow-up, in addition to the Pap smear and colposcopy, detection and genotyping of HPV DNA of 28 types were performed. In case of positivity to the DNA of the 5 genotypes 16, 18, 31, 33 and/or 45, we proceeded to detect HPV E6/E7 mRNA. Preliminary results. Of the 168 selected patients, 95.8% were HPV DNA positive at T0. In 60.9% of cases the infections were single (mostly HPV 16 and 31); in 39.1% they were multiple. HPV 16 was the most frequent genotype detected (57%). 94.3% (117/124) of patients positive for the 5 genotypes of HPV DNA were mRNA positive. Of the 168 patients, 38 dropped out. At 18 months (95% of patients), the persistence of HPV DNA of any genotype was 46%, that of HPV DNA of the 5 genotypes was 39%, with mRNA expression in 21%. We found recurrent disease (CIN2+) in 10.8% (14/130) at 18 months. Cytology was negative in 4/14 cases; the HPV DNA test was positive in all cases, mRNA testing in 11/12 cases. Conclusions: the HR-HPV DNA test is more sensitive than cytology, while mRNA testing is more specific in identifying a recurrence. Final data will be available after the planned follow-up.
SAHRAOUI, HOUARI ABDELKRIM. "Application de la meta-modelisation a la generation des outils de conception et de mise en œuvre des bases de donnees." Paris 6, 1995. http://www.theses.fr/1995PA066203.
Apostol, Costin. "Apports bioinformatiques et statistiques à l'identification d'inhibiteurs du récepteur MET." Thesis, Lille 2, 2010. http://www.theses.fr/2010LIL2S053.
The effect of polysaccharides on the HGF-MET interaction was studied using an experimental design with several microarrays under different experimental conditions. The purpose of the analysis is the selection of the best polysaccharides, inhibitors of the HGF-MET interaction. From a statistical point of view this is a classification problem. Statistical and computer processing of the obtained microarrays requires the implementation of the PASE platform with statistical analysis plug-ins for this type of data. The main feature of these statistical data is the repeated measurements: the experiment was repeated on 5 microarrays and all studied polysaccharides are replicated 3 times on each microarray. We are no longer in the classical case of globally independent data; we only have independence at the inter-subject level, not at the intra-subject level. We propose mixed models for data normalization and the representation of subjects by their empirical cumulative distribution function. The use of the Kolmogorov-Smirnov statistic appears natural in this context and we study its behavior in classification algorithms like hierarchical classification and k-means. The choice of the number of clusters and the number of repetitions needed for a robust classification are discussed in detail. The robustness of this methodology is measured by simulations and applied to HGF-MET data. The results helped the biologists and chemists from the Institute of Biology of Lille to choose the best polysaccharides in tests conducted by them. Some of these results also confirmed the intuition of the researchers. The R scripts implementing this methodology are integrated into the PASE platform. The use of functional data analysis on such data is part of the immediate future work
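As a rough illustration of this kind of distribution-based clustering of repeated measurements (not the actual PASE plug-in; the data below are hypothetical), the two-sample Kolmogorov-Smirnov statistic can be paired with hierarchical clustering:

```python
# Illustrative sketch: cluster subjects by the Kolmogorov-Smirnov distance
# between the empirical distributions of their repeated measurements.
import numpy as np
from scipy.stats import ks_2samp
from scipy.spatial.distance import squareform
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Hypothetical data: one array of repeated measurements per subject (polysaccharide).
subjects = [rng.normal(loc=mu, scale=1.0, size=15) for mu in (0.0, 0.1, 2.0, 2.2)]

n = len(subjects)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        # KS statistic = maximum gap between the two empirical CDFs
        dist[i, j] = dist[j, i] = ks_2samp(subjects[i], subjects[j]).statistic

labels = fcluster(linkage(squareform(dist), method="average"), t=2, criterion="maxclust")
print(labels)  # e.g. [1 1 2 2]: two groups of subjects with similar distributions
```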
Raynaut, William. "Perspectives de méta-analyse pour un environnement d'aide à la simulation et prédiction." Thesis, Toulouse 3, 2018. http://www.theses.fr/2018TOU30005/document.
The emergence of the big data phenomenon has led to increasing demands in data analysis, which are most often conducted by experts from other domains with little experience in data science. We therefore consider this important demand for intelligent assistance to data analysis, which receives increasing attention from the scientific community. As the first attempts on the subject often share similar shortcomings, we propose to address it through new meta-analysis processes. No evaluation standard having yet been set in this relatively new domain, we first propose a meta-analysis evaluation framework that allows us to test and compare the developed methods. In order to open new approaches to meta-analysis, we then consider one of its recurring issues: dataset characterization. We propose and evaluate such a characterization, consisting of a dissimilarity between datasets that makes use of a precise topological description to compare them. This dissimilarity enables a new meta-analysis approach producing recommendations of complete data analysis processes, which we then evaluate on a proof of concept. We thus detail the proposed methods of meta-analysis and the associated process of assistance to data analysis
Kane, Abdou. "Assimilation de données in situ et satellitaires dans le modèle de biogéochimie marine PISCES." Versailles-St Quentin en Yvelines, 2010. http://www.theses.fr/2010VERS0017.
Phytoplankton (microscopic algae in suspension) in the ocean plays an important role in the climate since it regulates the concentration of CO2 in the atmosphere by using the dissolved CO2 in surface water for photosynthesis. This biological activity induces the dissolution of atmospheric CO2 in the ocean. This process, called the "oceanic biological pump", is represented in numerical models of marine biogeochemistry. Coupled with a model of ocean circulation, a model of marine biogeochemistry such as PISCES, used at IPSL, makes it possible to better understand and quantify air-sea fluxes of CO2 and to study the role of marine biology in climate change over the coming decades. However, the species diversity of phytoplankton, the complexity of the physiological processes involved and the lack of available measurements require the use of coarse and uncertain parameterizations in global models of marine biogeochemistry, which severely limits the accuracy of simulations. Data assimilation, which consists of objectively combining a model and observations to reach the best possible compromise, provides a rigorous framework to overcome these shortcomings. The objective of this thesis is to develop a method for assimilating biogeochemical in situ and satellite data to improve the PISCES model. The variational method, which consists of iteratively adjusting the model parameters to minimize a distance to the observations, was adopted, using the YAO software developed at IPSL for the adjoint coding. A simultaneous assimilation of measurements from several contrasted (in biogeochemical terms) oceanographic stations was implemented to estimate 45 physiological parameters of the PISCES model. Using these optimized parameters in long simulations (50 to 500 years) demonstrated a significant improvement in chlorophyll concentration compared to the standard model, both for vertical profiles, which resemble those measured at JGOFS stations, and for global maps when compared to those provided by the SeaWiFS satellite. In the last part of this thesis, we show that it is possible to assimilate the satellite surface chlorophyll concentration rather than in situ vertical profiles, but to do this the satellite information on chlorophyll concentration must be supplemented by climatological information on other important biogeochemical tracers such as nitrates and silicates. The discussion on the contribution of these new data types concludes this thesis
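For context, the variational parameter-estimation principle described above can be summarized by a generic cost function (the notation is generic and not taken from the thesis itself):

```latex
% Generic variational cost function for parameter estimation (schematic form):
% p : vector of model parameters (e.g. the 45 physiological parameters),
% M_i(p) : model state at observation time i,  H : observation operator,
% y_i, R_i : observations and their error covariance.
\[
  J(p) \;=\; \tfrac{1}{2}\sum_{i}\bigl(y_i - H\,M_i(p)\bigr)^{\!\top} R_i^{-1}\bigl(y_i - H\,M_i(p)\bigr),
  \qquad
  p^{\star} \;=\; \arg\min_{p} J(p),
\]
% where the gradient of J with respect to p is obtained with the adjoint model
% (coded with YAO in the thesis) and fed to an iterative descent algorithm.
```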
Salmi, Cheik. "Vers une description et une modélisation des entrées des modèles de coût mathématiques pour l'optimisation des entrepôts de données." Thesis, Chasseneuil-du-Poitou, Ecole nationale supérieure de mécanique et d'aérotechnique, 2017. http://www.theses.fr/2017ESMA0006/document.
Data warehouses (DW) have become a mature technology. The emphasis of analysis requests is driven by technological change, new programming paradigms and Model-Driven Engineering (MDE). Before using these technological advances, the DW must be built and prepared for its proper operation. The construction phase has seen massive description and meta-modeling efforts to facilitate the definition of correspondences between local data source schemas and the DW schema and to reduce heterogeneity between sources. Despite its importance in all stages of the design life cycle of a DW, the operational phase, and in particular its physical task, did not receive the same interest in terms of description and meta-modeling. During this phase, mathematical cost models are used to quantify the quality of the proposed solutions. The development of these models requires efforts to collect and analyse the relevant parameters. To simulate the operation of a DW, all the dimensions of a DBMS must be integrated. In this thesis, we propose to describe these dimensions in detail with meta-modeling mechanisms. Given the singularity of and hierarchy between storage media, we have developed an ontology dedicated to storage media, which makes their properties explicit. The similarities between these media motivated us to develop a hybrid cache based on flash memory. This increases the cache's ability to store a large number of intermediate results shared by multiple decision-support queries. The reuse of these results increases the overall performance of the DBMS. Our contributions are validated with experiments using our theoretical cost models and the Oracle DBMS
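For illustration only, the sketch below shows the flavour of a parametric I/O cost model in which a scan's cost depends on the storage medium and on a hybrid flash cache; the media names, timings and the cached_fraction parameter are invented for the example and are not the thesis's models:

```python
# Toy sketch of a parametric I/O cost model (illustrative only; the cost models
# and storage-media ontology in the thesis are far more detailed).
from dataclasses import dataclass

@dataclass
class Medium:
    name: str
    read_ms_per_page: float   # average time to read one page
    write_ms_per_page: float  # average time to write one page

HDD = Medium("hdd", read_ms_per_page=8.0, write_ms_per_page=9.0)
FLASH = Medium("flash", read_ms_per_page=0.2, write_ms_per_page=0.8)

def scan_cost(n_pages: int, medium: Medium, cached_fraction: float = 0.0) -> float:
    """Cost (ms) of scanning n_pages, part of which may sit in a flash cache."""
    cached = int(n_pages * cached_fraction)
    return cached * FLASH.read_ms_per_page + (n_pages - cached) * medium.read_ms_per_page

# A query scanning 10,000 pages, with and without 40% of them in the hybrid cache:
print(scan_cost(10_000, HDD), scan_cost(10_000, HDD, cached_fraction=0.4))
```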
Hassan, Adel. "Style and Meta-Style : another Way to Reuse Software Architecture Evolution." Thesis, Nantes, 2018. http://www.theses.fr/2018NANT4041/document.
Over the last years, the size and complexity of software systems have increased dramatically, making the evolution process more complex and resource-consuming. Consequently, software architecture is becoming an important artifact in planning and carrying out the evolution process. It can provide an overall structural view of the system without undue focus on low-level details. This view can provide a deep understanding of previous design decisions and a means of analysing and comparing alternative evolution scenarios. Therefore, software architecture evolution has gained significant importance in developing methods, techniques and tools that can help architects plan evolution. To this end, an evolution styles approach has been introduced with the aim of capitalising on recurrent evolution practices and of fostering their reuse. In this thesis, we endeavour to tackle the challenges in software architecture evolution reuse by specifying a standard modeling framework that can conform to different evolution styles and satisfy the concerns of the different stakeholder groups. The primary contribution of this thesis is twofold. First, it introduces a meta-evolution style which specifies the core conceptual elements for software architecture evolution modeling. Second, it introduces a new methodology to develop a multi-view & multi-abstraction evolution style in order to reduce the complexity of the evolution model by breaking down an evolution style into several views, each of which covers a relevant set of aspects. The central ideas are embodied in a prototype tool in order to validate the applicability and feasibility of the proposed approaches
HUSSEIN, HASHIM. "Le statut juridique de la mer rouge et les voies d'eaux qui lui donnent acces. Introduction ou management des conflits entre les usagers." Nantes, 1989. http://www.theses.fr/1989NANT4010.
The Red Sea is a long, narrow body of water separating north-east Africa from the Arabian Peninsula. It is one of the most interesting and complex sea spaces in the world. The thesis aims at evaluating and making certain observations concerning the attitudes, general practice and politics of the Red Sea riparian states with respect to the development of the different aspects of the law of the sea, and at giving a full analytical account of the implications of this development for the legal regimes of the Red Sea waterways (the Suez Canal, the Gulf of Aqaba and the Strait of Bab al-Mandeb)
Bouillot, Flavien. "Classification de textes : de nouvelles pondérations adaptées aux petits volumes." Thesis, Montpellier, 2015. http://www.theses.fr/2015MONTS167.
Every day, classification is omnipresent and unconscious. For example, in the process of deciding when faced with something (an object, an event, a person), we instinctively think of similar elements in order to adapt our choices and behaviours. This storage in a particular category is based on past experiences and on the characteristics of the element. The larger and more accurate the experience, the more relevant the decision. It is the same when we need to categorize a document based on its content, for example to detect whether it is a children's story or a philosophical treatise. This treatment is of course more effective if we have a large number of works of these two categories and if the books have a large number of words. In this thesis we address precisely the problem of decision making when we have few learning documents and when the documents have a limited number of words. For this we propose a new approach based on new weightings, which enables us to accurately determine the weight to be given to the words that compose the document. To optimize the treatment, we propose a configurable approach: five parameters make our approach adaptable, regardless of the given classification problem. Numerous experiments have been conducted on various types of documents, in different languages and in different configurations. Depending on the corpus, they highlight that our proposal allows us to achieve superior results in comparison with the best approaches in the literature for addressing the problem of small datasets. The use of parameters adds complexity, since it is then necessary to determine the optimal values. Detecting the best settings and best algorithms is a complicated task whose difficulty is theorized through the No-Free-Lunch theorem. We treat this second problem by proposing a new meta-classification approach based on the concepts of distance and semantic similarity. Specifically, we propose new meta-features for document classification. This original approach allows us to achieve results similar to the best approaches in the literature while providing additional features. In conclusion, the work presented in this manuscript has been integrated into various technical implementations: one in the Weka software, one in an industrial prototype and a third in the product of the company that funded this work
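As a rough, hypothetical illustration of a parameterisable class-based term weighting for very small corpora (the actual weightings proposed in the thesis are not reproduced here; alpha and beta are invented knobs), one might write:

```python
# Illustrative sketch only: a tiny, parameterisable class-based term weighting
# for very small corpora. The weighting used in the thesis is different.
from collections import Counter, defaultdict
import math

def class_term_weights(docs, labels, alpha=1.0, beta=1.0):
    """docs: list of token lists; labels: class of each doc."""
    df_in_class = defaultdict(Counter)   # class -> number of its docs containing each term
    n_docs_class = Counter(labels)
    for tokens, y in zip(docs, labels):
        for t in set(tokens):
            df_in_class[y][t] += 1
    weights = {}
    vocabulary = {t for counts in df_in_class.values() for t in counts}
    for y, n_y in n_docs_class.items():
        n_other = len(docs) - n_y
        for t in vocabulary:
            inside = df_in_class[y][t] / n_y                      # how typical of the class
            outside = sum(df_in_class[c][t] for c in df_in_class if c != y) / max(n_other, 1)
            weights[(y, t)] = (inside ** alpha) * math.log(1.0 + 1.0 / (beta + outside))
    return weights

docs = [["wolf", "forest", "child"], ["dragon", "child"], ["ethics", "reason"], ["reason", "logic"]]
labels = ["tale", "tale", "essay", "essay"]
print(sorted(class_term_weights(docs, labels).items(), key=lambda kv: -kv[1])[:3])
```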
Verdoit-Jarraya, Marion. "Caractérisation et modélisation de la dynamique spatiale et saisonnière de populations démersales et benthiques exploitées de la Mer Celtique." Paris 6, 2003. http://www.theses.fr/2003PA066596.
Kaddes, Mourad. "Etudes des transactions plates et étendues dans les SGBD temps réels." Thesis, Le Havre, 2013. http://www.theses.fr/2013LEHA0009/document.
This thesis presents a study of flat and extended transaction models in real-time DBMSs (RTDBMSs). The study is carried out in two steps: (i) the first step aims to help designers describe and compare models of real-time transactions; (ii) the second step completes this study with a stochastic study of RTDBMS performance using two models of real-time transactions: the flat and nested transaction models. In the first step, we introduced the meta-model « M-RT-ACTA », which takes into account the temporal characteristics of transactions and data and their real-time interactions. « M-RT-ACTA » allows designers to define and compare new models of real-time transactions. The formal description of « M-RT-ACTA » validates our proposals. To complete this work, we observed that transaction scheduling is an important area in RTDBMSs, so we proposed in the second step a stochastic study of RTDBMS performance. Thus, we proposed to improve the success ratio of flat transactions with the GEDF protocol (a generalization of EDF) and we adapted this study to nested transactions
Chakroun, Chedlia. "Contribution à la définition d'une méthode de conception de bases de données à base ontologique." Phd thesis, ISAE-ENSMA Ecole Nationale Supérieure de Mécanique et d'Aérotechique - Poitiers, 2013. http://tel.archives-ouvertes.fr/tel-00904117.
Full textEl, Abed Walid. "Meta modèle sémantique et noyau informatique pour l'interrogation multilingue des bases de données en langue naturelle (théorie et application)." Besançon, 2001. http://www.theses.fr/2001BESA1014.
Full textSun-Hosoya, Lisheng. "Meta-Learning as a Markov Decision Process." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLS588/document.
Machine Learning (ML) has enjoyed huge successes in recent years and an ever-growing number of real-world applications rely on it. However, designing promising algorithms for a specific problem still requires huge human effort. Automated Machine Learning (AutoML) aims at taking the human out of the loop and developing machines that generate / recommend good algorithms for given ML tasks. AutoML is usually treated as an algorithm / hyper-parameter selection problem; existing approaches include Bayesian optimization, evolutionary algorithms as well as reinforcement learning. Among them, auto-sklearn, which incorporates meta-learning techniques in its search initialization, ranks consistently well in AutoML challenges. This observation oriented my research to the Meta-Learning domain. This direction led me to develop a novel framework based on Markov Decision Processes (MDP) and reinforcement learning (RL). After a general introduction (Chapter 1), my thesis work starts with an in-depth analysis of the results of the AutoML challenge (Chapter 2). This analysis oriented my work towards meta-learning, leading me first to propose a formulation of AutoML as a recommendation problem, and ultimately to formulate a novel conceptualisation of the problem as an MDP (Chapter 3). In the MDP setting, the problem is reduced to filling up, as quickly and efficiently as possible, a meta-learning matrix S, in which rows correspond to ML tasks and columns to ML algorithms. A matrix element S(i, j) is the performance of algorithm j applied to task i. Searching efficiently for the best values in S allows us to quickly identify the algorithms best suited to given tasks. In Chapter 4 the classical hyper-parameter optimization framework (HyperOpt) is first reviewed. In Chapter 5 a first meta-learning approach is introduced along the lines of our paper ActivMetaL, which combines active learning and collaborative filtering techniques to predict the missing values in S. Our latest research applies RL to the MDP problem we defined to learn an efficient policy to explore S. We call this approach REVEAL and propose an analogy with a series of toy games to help visualize agents' strategies to reveal information progressively, e.g. masked areas of images to be classified, or ship positions in a battleship game. This line of research is developed in Chapter 6. The main results of my PhD project are: 1) HP / model selection: I have explored the Freeze-Thaw method and optimized the algorithm to enter the first AutoML challenge, achieving 3rd place in the final round (Chapter 3). 2) ActivMetaL: I have designed a new algorithm for active meta-learning (ActivMetaL) and compared it with other baseline methods on real-world and artificial data. This study demonstrated that ActivMetaL is generally able to discover the best algorithm faster than baseline methods. 3) REVEAL: I developed a new conceptualization of meta-learning as a Markov Decision Process and put it into the more general framework of REVEAL games. With a master student intern, I developed agents that learn (with reinforcement learning) to predict the next best algorithm to be tried. To develop this agent, we used surrogate toy tasks of REVEAL games. We then applied our methods to AutoML problems. The work presented in my thesis is empirical in nature. Several real-world meta-datasets were used in this research. Artificial and semi-artificial meta-datasets are also used in my work.
The results indicate that RL is a viable approach to this problem, although much work remains to be done to optimize algorithms to make them scale to larger meta-learning problems
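To make the meta-learning matrix concrete, here is a toy sketch (not the thesis's ActivMetaL or REVEAL agents) in which a new row of a synthetic matrix S is revealed a few entries at a time, with candidate algorithms ranked by their average performance on past tasks:

```python
# Toy illustration: reveal the entries of a new row of the meta-learning matrix S
# one evaluation at a time, ranking candidate algorithms by their mean
# performance on previously filled rows.
import numpy as np

rng = np.random.default_rng(42)
n_tasks, n_algos = 20, 8
skill = rng.uniform(0.5, 0.9, size=n_algos)                 # hidden algorithm quality
S = np.clip(skill + rng.normal(0, 0.05, (n_tasks, n_algos)), 0, 1)

past, new_task = S[:-1], S[-1]                               # previously filled rows vs. new task
order = np.argsort(-past.mean(axis=0))                       # meta-learned ranking of algorithms

budget, best = 3, -np.inf
for j in order[:budget]:                                     # only `budget` evaluations allowed
    best = max(best, new_task[j])                            # "run" algorithm j on the new task

print(f"best score found with {budget} trials: {best:.3f} "
      f"(oracle best: {new_task.max():.3f})")
```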
Mascret, Ariane. "Développement d'une approche SIG pour l'intégration de données Terre/Mer." Phd thesis, Ecole nationale supérieure d'arts et métiers - ENSAM, 2010. http://pastel.archives-ouvertes.fr/pastel-00542500.
Couffin, Florent. "Modèle de données de référence et processus de spécialisation pour l'intégration des activités de conception en génie automatique." Cachan, Ecole normale supérieure, 1997. http://www.theses.fr/1997DENS0005.
Le, Berre Iwan. "Mise au point de méthodes d'analyse et de représentation des interactions complexes en milieu littoral." Brest, 1999. http://www.theses.fr/1999BRES1008.
Bazile, Jeanne. "Intégration de données publiques et analyses protéomiques pour révéler les mécanismes et biomarqueurs du dépôt de lipides dans la carcasse ou le muscle bovin." Thesis, Université Clermont Auvergne (2017-2020), 2019. http://www.theses.fr/2019CLFAC074.
Producing meat animals with adequate muscular and adipose masses (i.e. lean-to-fat ratio) is an economic challenge for the beef industry. The lean-to-fat ratio influences the weight and yield of carcasses as well as the sensorial and nutritional quality of the meat. Omics methods have been widely used to understand the mechanisms underlying the variability of adipose and muscle tissue growth in the bovine. However, it is not always easy to extract or generate synthetic biological information from this volume of data. The objective of my thesis was to aggregate and analyse public data to propose genes or proteins related to the lean-to-fat ratio, and to identify data to be completed by experiment. To achieve this goal, experimental and in silico methods were combined. The majority of the data available in public databases were muscle transcriptomic data, and very rarely proteomic data. Data from 5 publications comparing the muscle proteome of bovine breeds divergent in their intramuscular lipid content were aggregated and allowed the identification of 50 differentially abundant proteins. Of these, 9 were concordant in at least 2 publications. As those data were obtained only in late-maturing breeds, we analysed the proteome of "Rouges des Prés" cows, which deposit fat at a later stage of development. The longissimus thoracis muscle of "Rouges des Prés" cows diverging in their muscular and carcass adiposity was analysed by shotgun and 2DE techniques. Of the 47 proteins significantly associated with adipose depots in the muscle or carcass, 21 were common to published data and 26 had never been identified before. In particular, APOBEC2 abundance was strongly correlated with both carcass and muscle adiposity. Among the microarray data available in the public databases, 84 and 12 datasets relative to muscular and adipose growth were selected, respectively. Because of missing metadata, only 33 (32 "MT" and 1 "AT") were used and their identifiers updated on the current bovine genome (UCD1.2; collaboration with Sigenae). Data from the 32 "muscle" datasets were categorized according to age, breed, sex or nutrition. Data were regrouped by category and analysed by p-value combination according to the inverse normal method (collaboration with the Gabi UMR). For the age category, a major factor influencing intramuscular lipid content, we identified 238 genes differentially expressed between two ages in the longissimus dorsi of bovines of 5 different breeds. Among these 238 genes, 97 were identified both in at least 2 datasets analysed individually and in the meta-analysis. The meta-analysis confirmed the dynamic regulation of glycolytic and oxidative metabolisms depending on bovine age. 17 genes were exclusively identified in the meta-analysis as differentially expressed between two ages. Among the identified genes, some are linked to lipid metabolism (APOE, LDLR, MXRA8) and others may induce (YBX1) or repress (MAPK14, YWAH, ERBB2) the differentiation of muscle progenitor cells towards the adipose lineage. Integration of public data, in particular by meta-analysis, provided a global view of the biological mechanisms and biomarkers (genes or proteins) of the lean-to-fat ratio most frequently identified across several breeds. The relationships between the abundances of the identified molecules and adiposity criteria remain to be quantified with a view to biomarker validation; such biomarkers would make it possible to evaluate carcass and muscle adipose tissue percentage from gene or protein abundance without the need to slaughter the animal
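The inverse normal (Stouffer) p-value combination mentioned above can be illustrated in a few lines; the per-dataset p-values and weights below are made up for the example, and scipy's built-in implementation is used as a cross-check:

```python
# Sketch of the inverse normal (Stouffer) p-value combination, applied to
# made-up per-dataset p-values for one gene. Not the thesis pipeline.
import numpy as np
from scipy import stats

p_values = np.array([0.04, 0.20, 0.01, 0.08])        # hypothetical p-values, one per dataset
weights = np.sqrt(np.array([60, 24, 90, 40]))         # e.g. sqrt of each dataset's sample size

# Manual computation: z_i = Phi^{-1}(1 - p_i), Z = sum(w_i z_i) / sqrt(sum(w_i^2))
z = stats.norm.isf(p_values)
Z = np.sum(weights * z) / np.sqrt(np.sum(weights ** 2))
p_combined = stats.norm.sf(Z)

# scipy offers the same rule directly:
_, p_check = stats.combine_pvalues(p_values, method="stouffer", weights=weights)
print(p_combined, p_check)   # the two values agree
```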
FONTAINE, CHRISTINE. "Sismicite et structure en vitesse de la bordure cotiere de la marge nord ligure a partir des donnees de la campagne a terre et en mer sisbalig ii. Hypotheses sur la formation et l'evolution actuelle de la marge." Paris 6, 1996. http://www.theses.fr/1996PA066555.
ORPHANIDIS, ELIE. "Conditions physicochimiques de precipitation de la barytine epigenetique dans le bassin sud-ouest de la fosse atlantis ii (mer rouge) : donnees des inclusions fluides et approche experimentale. implications pour le depot des metaux de base et des metaux precieux." Orléans, 1995. http://www.theses.fr/1995ORLE2022.
Ammar, Adel. "Restitution de la salinité de surface de l'océan à partir des mesures SMOS : une approche neuronale?" Toulouse 3, 2008. http://thesesups.ups-tlse.fr/475/.
Using neural networks to retrieve the sea surface salinity (SSS) from the observed Soil Moisture and Ocean Salinity (SMOS) brightness temperatures (TBs) is an empirical approach that offers the possibility of being independent of any theoretical emissivity model. We prove that this approach is applicable to all pixels over the ocean by designing a set of neural networks with different inputs. Besides, we demonstrate that a judicious distribution of the geophysical parameters in the learning database makes it possible to markedly reduce the systematic regional biases of the retrieved SSS, which are due to the high noise on the TBs. An equalization of the distribution of the geophysical parameters, followed by a new technique for boosting the learning process, makes the regional biases almost disappear for latitudes between 40°S and 40°N, while the global standard deviation remains between 0.6 psu (at the center of the swath) and 1 psu (at the edges)
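As a purely illustrative sketch of the neural-network retrieval idea (synthetic data and simplified inputs; this is not the SMOS processing chain), one might write:

```python
# Illustrative sketch with synthetic data: a small neural network regressing sea
# surface salinity from brightness temperatures plus auxiliary geophysical inputs.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n = 5000
sss = rng.uniform(32, 38, n)                      # "true" salinity (psu)
sst = rng.uniform(5, 30, n)                       # sea surface temperature (°C)
wind = rng.uniform(0, 15, n)                      # wind speed (m/s)
# crude synthetic brightness temperatures, with noise mimicking radiometric error
tb_h = 80 + 0.3 * sst - 0.5 * sss + 0.2 * wind + rng.normal(0, 1.5, n)
tb_v = 110 + 0.4 * sst - 0.6 * sss + 0.1 * wind + rng.normal(0, 1.5, n)

X = np.column_stack([tb_h, tb_v, sst, wind])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0))
model.fit(X[:4000], sss[:4000])
rmse = np.sqrt(np.mean((model.predict(X[4000:]) - sss[4000:]) ** 2))
print(f"test RMSE: {rmse:.2f} psu")
```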
Ben, Ticha Mohamed Bassam. "Fusion de données satellitaires pour la cartographie du potentiel éolien offshore." Phd thesis, École Nationale Supérieure des Mines de Paris, 2007. http://tel.archives-ouvertes.fr/tel-00198912.
Huret, Martin. "Apports des données "couleur de l'eau" à la modélisation couplée physique -biogéochimie en milieu dynamique côtier : application au Rio de la Plata et au Golfe de Gascogne." Toulouse 3, 2005. http://www.theses.fr/2005TOU30023.
Métadier, Marjolaine. "Traitement et analyse de séries chronologiques continues de turbidité pour la formulation et le test de modèles des rejets urbains par temps de pluie." Phd thesis, INSA de Lyon, 2011. http://tel.archives-ouvertes.fr/tel-00668706.
Castruccio, Frédéric. "Apports des données gravimétriques GRACE pour l'assimilation de données altimétriques et in-situ dans un modèle de l'Océan Pacifique Tropical." Phd thesis, Université Joseph Fourier (Grenoble), 2006. http://tel.archives-ouvertes.fr/tel-00138506.
Full textLes récentes avancées de notre connaissance du géoïde nous ont conduit à étudier l'impact de l'utilisation d'un signal altimétrique absolu. Un modèle (OPA) de l'Océan Pacifique Tropical, où des observations in-situ et quasi-synoptiques sont disponibles (réseau TAO), et un filtre de Kalman en rang réduit (SEEK) ont été mis au point et utilisés dans différentes configurations. La première suppose une situation pré-GRACE et utilise une MDT artificielle. La deuxième utilise une MDT observée déduite du géoïde GRACE. Conjointement à l'altimétrie, les profils de température TAO sont assimilés.
Ce travail montre l'importance d'une bonne référence pour les résidus altimétriques. Le résultat le plus important concerne la capacité du système d'assimilation utilisant GRACE à mieux composer avec des données mixtes: satellites et in-situ. Ici, l'assimilation conjointe d'altimétrie et de données TAO est plus performante grâce à la meilleure compatibilité des données.
En outre, une analyse physique, qui considère l'apport de l'assimilation à l'amélioration de la représentation de la dynamique du Pacifique Tropical, a été conduite. L'originalité de ce travail est de montrer comment l'assimilation contribue à améliorer notre compréhension des mécanismes physiques en action dans ce bassin.
De manière intéressante et rétrospective, cette analyse révèle aussi une zone (8°N) où les données GRACE semblent avoir des faiblesses qu'il serait judicieux de corriger.
Meddis, Alessandra. "Inference and validation of prognostic marker for correlated survival data with application to cancer." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASR005.
Clustered data often arise in medical research. They are characterized by correlations between observations belonging to the same cluster. Here, we discuss some extensions to clustered data in different contexts: evaluating the performance of a candidate biomarker, and assessing the treatment effect in an individual patient data (IPD) meta-analysis with competing risks. The former was motivated by the IMENEO study, an IPD meta-analysis in which the prognostic validity of Circulating Tumor Cells (CTCs) was of interest. Our objective was to determine how well CTCs discriminate patients who died within t years from those who did not, comparing individuals with the same tumor stage. Although the covariate-specific time-dependent ROC curve has been widely used for biomarker discrimination, there is no methodology that can handle clustered censored data. We proposed an estimator for the covariate-specific time-dependent ROC curve and area under the ROC curve when clustered failure times are detected. We considered a shared frailty model for modeling the effect of the covariates and the biomarker on the outcome in order to account for the cluster effect. A simulation study was conducted; it showed negligible bias for the proposed estimator and for a nonparametric one based on inverse probability of censoring weighting, while a semiparametric estimator ignoring the clustering is markedly biased. We further considered an IPD meta-analysis with competing risks to assess the benefit of the addition of chemotherapy to radiotherapy on each competing endpoint for patients with nasopharyngeal carcinoma. Recommendations for the analysis of competing risks in the context of randomized clinical trials are well established. Surprisingly, no formal guidelines have yet been proposed to conduct an IPD meta-analysis with competing risk endpoints. To fill this gap, this work details how to handle the heterogeneity between trials via a stratified regression model for competing risks, and it highlights that the usual inconsistency metrics for assessing heterogeneity can readily be employed. The typical issues that arise with meta-analyses and the advantages due to the availability of patient-level characteristics were underlined. We proposed a landmark approach for the cumulative incidence function to investigate the impact of follow-up on the treatment effect. The assumption of non-informative cluster size was made in both analyses. The cluster size is said to be informative when the outcome depends on the size of the cluster conditional on a set of covariates. Intuitively, a meta-analysis would meet this assumption. However, non-informative cluster size is commonly assumed even though it may not be true in some situations, and this leads to incorrect results. Informative cluster size (ICS) is a challenging problem and its presence has an impact on the choice of the correct methodology. We discuss in more detail the interpretation of results and which quantities can be estimated under which conditions. We proposed a test for ICS with censored clustered data. To our knowledge, this is the first such test in the context of survival analysis. A simulation study was conducted to assess the power of the test and some illustrative examples were provided. The implementation of each of these developments is available at https://github.com/AMeddis
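For reference, one common (cumulative/dynamic) definition of the covariate-specific time-dependent ROC quantities is sketched below in standard notation; this is the textbook form, not the clustered-data estimator developed in the thesis:

```latex
% Cumulative/dynamic covariate-specific time-dependent ROC quantities;
% M : biomarker (e.g. CTC count), T : survival time, X : covariates, c : threshold.
\[
  \mathrm{Se}(c, t \mid x) \;=\; P\bigl(M > c \,\big|\, T \le t,\ X = x\bigr),
  \qquad
  \mathrm{Sp}(c, t \mid x) \;=\; P\bigl(M \le c \,\big|\, T > t,\ X = x\bigr),
\]
\[
  \mathrm{ROC}_{t}(u \mid x) \;=\; \mathrm{Se}\bigl(c(u),\, t \mid x\bigr)
  \ \text{ with } 1-\mathrm{Sp}\bigl(c(u), t \mid x\bigr) = u,
  \qquad
  \mathrm{AUC}(t \mid x) \;=\; \int_{0}^{1} \mathrm{ROC}_{t}(u \mid x)\, du .
\]
```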
Ablin, Pierre. "Exploration of multivariate EEG /MEG signals using non-stationary models." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLT051.
Independent Component Analysis (ICA) models a set of signals as linear combinations of independent sources. This analysis method plays a key role in electroencephalography (EEG) and magnetoencephalography (MEG) signal processing. Applied to such signals, it makes it possible to isolate interesting brain sources, locate them, and separate them from artifacts. ICA belongs to the toolbox of many neuroscientists and is part of the processing pipeline of many research articles. Yet, the most widely used algorithms date back to the 90's. They are often quite slow, and stick to the standard ICA model, without more advanced features. The goal of this thesis is to develop practical ICA algorithms to help neuroscientists. We follow two axes. The first one is that of speed. We consider the optimization problems solved by two of the ICA algorithms most widely used by practitioners: Infomax and FastICA. We develop a novel technique based on preconditioning the L-BFGS algorithm with Hessian approximations. The resulting algorithm, Picard, is tailored for real data applications, where the independence assumption is never entirely true. On M/EEG data, it converges faster than the 'historical' implementations. Another possibility to accelerate ICA is to use incremental methods, which process a few samples at a time instead of the whole dataset. Such methods have gained huge interest in the last years due to their ability to scale well to very large datasets. We propose an incremental algorithm for ICA, with important descent guarantees. As a consequence, the proposed algorithm is simple to use and does not have a critical and hard-to-tune parameter like a learning rate. In a second axis, we propose to incorporate noise in the ICA model. Such a model is notoriously hard to fit under the standard non-Gaussian hypothesis of ICA, and would make estimation extremely long. Instead, we rely on a spectral diversity assumption, which leads to a practical algorithm, SMICA. The noise model opens the door to new possibilities, like finer estimation of the sources, and the use of ICA as a statistically sound dimension reduction technique. Thorough experiments on M/EEG datasets demonstrate the usefulness of this approach. All algorithms developed in this thesis are open-source and available online. The Picard algorithm is included in the largest M/EEG processing Python library, MNE, and in the Matlab library EEGLAB
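As a generic illustration of what ICA does on mixed signals (using scikit-learn's FastICA rather than the Picard solver developed in the thesis, which is distributed as a separate package), one might write:

```python
# Generic ICA illustration on synthetic mixtures (scikit-learn's FastICA,
# not the Picard algorithm itself).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * t),                       # oscillatory source
                np.sign(np.sin(3 * t)),              # square-wave "artifact"
                rng.laplace(size=t.size)]            # heavy-tailed noise source
sources /= sources.std(axis=0)

mixing = rng.normal(size=(3, 3))                     # unknown linear mixing ("sensors")
observed = sources @ mixing.T                        # what EEG/MEG sensors would record

ica = FastICA(n_components=3, random_state=0)
estimated = ica.fit_transform(observed)              # recovered sources (up to order/scale)
print(np.round(np.corrcoef(sources.T, estimated.T)[:3, 3:], 2))
```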
Chriki, Sghaïer. "Méta-analyses des caractéristiques musculaires afin de prédire la tendreté de la viande bovine." Phd thesis, Université Blaise Pascal - Clermont-Ferrand II, 2013. http://tel.archives-ouvertes.fr/tel-00881204.
Thual, Sulian. "Modèles réduits pour l'étude des mécanismes et de la modulation de l'oscillation australe El Niño." Toulouse 3, 2012. http://thesesups.ups-tlse.fr/1821/.
The El Niño Southern Oscillation (ENSO) is the most intense climatic fluctuation on Earth after the seasonal cycle. The observation, understanding and forecasting of this fluctuation with worldwide impacts are major scientific issues. This thesis (entitled "Simple Models for Understanding the Mechanisms and Modulation of ENSO") documents various aspects of ENSO such as its mechanisms, its modulation and its forecast. These aspects are tackled by developing a hierarchy of models of the equatorial Pacific of increasing complexity, ranging from conceptual models to a data assimilation method in an intermediate-complexity model. We first study the mechanisms of ENSO formation. We develop an alternative derivation of the recharge/discharge conceptual model, in which ENSO arises from a basin-wide adjustment of the equatorial thermocline. We also implement an original diagnostic in a model of equatorial coupled instabilities, which evidences a new mechanism of ENSO formation in which reflections at the ocean boundaries are secondary. The background ocean stratification contributes to the decadal modulation of ENSO characteristics. This relation is addressed in a new reduced model that takes into account the gravest baroclinic modes of a continuously stratified ocean. The space of model solutions is explored, indicating a control on ENSO stability by the characteristics of the equatorial thermocline. The sensitivity to stratification over the recent decades is put in perspective with the sensitivity to thermodynamic and atmospheric feedbacks. We stress in particular certain limitations of the usual methods of estimating the thermocline feedback in the central Pacific. Finally, we implement an Ensemble Kalman Filter method in an existing intermediate model of the equatorial Pacific, in order to assimilate sea level observations and to initialize retrospective forecasts. We show that the major model constraint is on the basin modes associated with the recharge/discharge process of the equatorial Pacific. Our work provides a formalism to diagnose the modulation of ENSO characteristics in observations, climate projections and forecasts. The results support the need to extend the understanding of ENSO mechanisms, in order to account for the diversity of observed regimes and to improve forecasts
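For context, a schematic form of the recharge/discharge (recharge oscillator) conceptual model referred to above is, in standard textbook notation (not the alternative derivation proposed in the thesis):

```latex
% Schematic recharge/discharge oscillator (after Jin, 1997), textbook form:
% T : eastern-Pacific SST anomaly,  h : western-Pacific thermocline-depth anomaly.
\[
  \frac{dT}{dt} \;=\; R\,T + \gamma\,h,
  \qquad
  \frac{dh}{dt} \;=\; -r\,h - \alpha\,T .
\]
% R sets the growth/damping of SST anomalies, gamma the effect of the recharged
% thermocline on SST, and r, alpha the slow adjustment of the warm-water volume;
% the coupled system oscillates between El Niño and La Niña phases.
```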
Ben, Khedher Anis. "Amélioration de la qualité des données produits échangées entre l'ingénierie et la production à travers l'intégration de systèmes d'information dédiés." Thesis, Lyon 2, 2012. http://www.theses.fr/2012LYO20012.
This research work contributes to improving the quality of the data exchanged between production and the engineering units dedicated to product design and production system design. This improvement is studied through the interactions between product lifecycle management and production management. These two concepts are supported, wholly or partly, by industrial information systems; the study of their interactions therefore leads to the integration of these information systems (PLM, ERP and MES). In a highly competitive and globalized environment, companies are forced to innovate and reduce costs, especially production costs. Faced with these challenges, the volume and frequency of change of production data are increasing due to the steady reduction of product lifetime and time to market, the increasing customization of products and the generalization of continuous improvement in production. Consequently, all production data need to be formalized, managed and provided to the production operators and machines. After analysing the data quality of each existing architecture and demonstrating their inability to address this problem, an architecture based on the integration of the three information systems involved in production (PLM, ERP and MES) has been proposed. This architecture leads to two complementary sub-problems. The first is the development of an architecture based on Web services to improve the accessibility, safety and completeness of the data exchanged. The second is an integration architecture based on ontologies, offering semantics-based integration mechanisms in order to ensure the correct interpretation of the data exchanged. Finally, the model of the software tool supports the proposed solution and ensures the integration of the data exchanged between engineering and production
Hajj-Hassan, Hicham. "Les bases de données environnementales : entre complexité et simplification : mutualisation et intégration d’outils partagés et adaptés à l’observatoire O-LiFE." Thesis, Montpellier, 2016. http://www.theses.fr/2016MONTT135.
O-LiFE is an environmental observatory dedicated to the study of resources and biodiversity in the critical zone of life, focused on the Mediterranean. It is also a structure at the interface between basic research and stakeholders. This platform, initiated as a collaboration between Lebanese and French teams, focuses first on systemic observation of the natural environment around the themes of water, biodiversity and environmental management. The foundation of the observatory is the implementation of a transdisciplinary approach to the challenge of global change. Organizing, sharing, sustaining and enhancing environmental data is a priority objective to enable the wider community to converge towards a truly systemic and transdisciplinary approach to environmental issues in the Mediterranean. The construction of an information system allowing complete connection of data is therefore a priority. However, this implementation is complicated by a number of challenges: meeting the expectations of end users and data producers, who do not share the same needs, and taking into account the natural heterogeneity of the data. In this PhD, we detail the thinking and work needed to develop the architecture of the observatory's information system. The work was initiated by a survey to better understand the existing sources of data. We then proposed to use observational data management environments based on shared ontologies and on the recommendations of recognized consortia (OGC). Extensions are proposed to enable the inclusion of different perspectives on the data through multi-mapping. This extension allows a decoupling between the original vision of the data producer and the many possible uses of the data, by crossing it with other data sources and/or other views. We then applied the methodology to the O-LiFE data and were able to carry out inter-database analyses (between two distinct data sources) and intra-database analyses (by juxtaposing different points of view on the same data source). This work is a demonstration of the fundamental role of IS tools and observatories in the essential gathering of scientific communities and stakeholders to resolve the major environmental challenges facing society, particularly in the Mediterranean
Semaan, Bernard. "Raffinement de la localisation d’images provenant de sites participatifs pour la mise à jour de SIG urbain." Thesis, Ecole centrale de Nantes, 2018. http://www.theses.fr/2018ECDN0055/document.
Cities are active spots on the globe. They are in constant change: new building constructions, demolitions and business changes may occur on a daily basis. City managers aim to keep the digital model of the city as up to date as possible. The model may consist of 2D maps but may also be a 3D reconstruction or a street imagery sequence. In order to share geographical information and keep a 2D map updated, collaborative cartography was born; the "OpenStreetMap.org" platform is one of the best-known platforms in this field. In order to create an active collaborative database of street imagery, we suggest using 2D images available on image sharing platforms like "Flickr", "Twitter", etc. Images downloaded from such platforms feature a rough localization and no orientation information. We propose a system that helps find a better localization of the images and provides information about the camera orientation with which they were shot. The system uses both the visual and the semantic information existing in a single image. To do that, we present a fully automatic processing chain composed of three main layers: a data retrieval and preprocessing layer, a feature extraction layer, and a decision-making layer. We then present the results of the whole system, combining both semantic and visual information processing. We call our system Data Gathering system for image Pose Estimation (DGPE). We also present a new automatic method for detecting buildings of simple architecture, which we have developed and used in our system. This method is based on segments detected in the image and is called Segments Based Building Detection (SBBD). We test our method against some weather changes and occlusion problems. We finally compare our building detection results with another state-of-the-art method using several image databases
Al-Kutby, Sahar. "Applications of spice extracts and other hurdles to improve microbial safety and shelf-life of cooked, high fat meat products (doner kebab)." Thesis, University of Plymouth, 2012. http://hdl.handle.net/10026.1/1184.
Full textRICHARD, JOEL. "Application de methodes de traitements numeriques de signaux a la detection, compression et reconnaissance d'evenements d'origines sismiques dans une station autonome de type sismographe fond de mer." Rennes 1, 1988. http://www.theses.fr/1988REN10121.
Vermeulen, Mathieu. "Une approche meta-design des learning games pour développer leur usage." Electronic Thesis or Diss., Sorbonne université, 2018. http://www.theses.fr/2018SORUS093.
This thesis in computer science is in the field of Technology Enhanced Learning (TEL) and more specifically in the field of Learning Games (LG), serious games dedicated to learning. It deals with their design, with tools and models to facilitate it, and with their use. To tackle this problem, we use meta-design, an approach aiming to strongly involve the end users in the design stage but also in the use stage. To implement this approach with teachers, whom we consider end users of LG, we propose different iterations of a simple and representable LG model to facilitate the collaborative design of these TELs, but also their reengineering. After a first iteration, the second iteration proposes a model named DISC and the associated design method. They were tested in the co-design of a learning game used by teachers in higher education in the context of a MOOC and as an additional activity of a course. To involve teachers in the use stage, we propose to articulate this model with a learner-trace visualization tool to detect problematic patterns and thus facilitate the reengineering of LG, the visualizations allowing the analysis of traces collected during the use stage. To carry out this research work, we chose to work with the THEDRE method, which proposes an iterative research cycle supported by the feedback of indicators evaluating the process throughout the method. This continuous improvement, supported by the experiments, allows us to validate our propositions about meta-design for learning games
Blanchard, Pierre. "Méta-analyses sur données individuelles d’essais randomisés dans les cancers des voies aéro-digestives supérieures. Développements méthodologiques et cliniques." Thesis, Paris 11, 2013. http://www.theses.fr/2013PA11T065/document.
Head and neck cancers represent the fifth cause of death from cancer in France. They are often diagnosed at an advanced stage. The poor prognosis of these diseases has led to the introduction of intensified treatments. Numerous randomized trials have evaluated the benefits of the addition of chemotherapy to locoregional treatment and of the modification of radiotherapy fractionation. The results of these trials have been synthesized in two individual patient data meta-analyses coordinated by the Meta-Analysis Unit of Gustave Roussy Cancer Center. However, these meta-analyses raise clinical and methodological questions, some of which are dealt with in this thesis. First, we studied by different means the interaction between patient-level covariates, tumor site and treatment effect. We also adapted the methodology of network meta-analysis to survival data in order to perform a global analysis of the entire meta-analysis database and to rank treatments according to their efficacy, including some treatments that had not been directly compared. Some of these results were eventually confirmed by subsequently published randomized trials. We reviewed the advantages and limits of network meta-analysis. We also launched the update of all these meta-analyses in order to produce results consistent with current clinical practice, update patient follow-up, and collect additional data regarding treatment efficacy, toxicity and compliance. The final results of the taxane induction meta-analysis are presented in this manuscript
Luong, Ngoc-Du. "Mieux comprendre l’altération microbiologique de saucisses fraîches au travers du prisme de la modélisation." Thesis, Nantes, Ecole nationale vétérinaire, 2020. http://www.theses.fr/2020ONIR150F.
The main cause of spoilage in fresh meat products is associated with the development of bacteria during storage. Understanding the link between bacterial activities and the evolution of spoilage in meat produced with preservation strategies may be difficult because of microbiota complexity, and it requires appropriate statistical analyses to integrate experimental data. This work aimed to develop innovative modelling tools in order to better understand microbiological spoilage in fresh poultry and pork sausages by integrating experimental data on microbiota, volatilome and sensorial profiles. Four different models were developed. A mixed-effect model made it possible to study the effects of lactate formulation and modified atmosphere packaging. A Bayesian model made it possible to describe product pH over time. Three regression approaches were developed to evaluate the link between the initial microbiota and spoilage dynamics. Lastly, a multi-block approach was used to identify the causality link between microbiota, volatilome and sensorial profiles. These models confirmed the dynamic nature of all the spoilage-related responses considered. Lactate formulation and MAP were found to have different effects on spoilage depending on the meat type and the studied response. Specific bacterial groups were found to be responsible for several volatile compounds producing off-odours. The developed models have the advantages of considering potential sources of biological variability and of better analysing complex datasets. Progress towards predicting spoilage from microbiological data will require further exploration of these models suitable for processing complex data
Jousset, Solène. "Vers l'assimilation de données estimées par radar Haute Fréquence en mer macrotidale." Thesis, Brest, 2016. http://www.theses.fr/2016BRES0029/document.
The Iroise Sea has been observed since 2006 by High Frequency (HF) radars, which estimate surface currents. These measurements offer the high resolution and high frequency needed to capture the dynamics of the coastal domain. This thesis aims at designing and applying a method of assimilation of these data in a realistic numerical model, to optimize the bottom friction and to correct the model state in order to improve the representation of the residual tidal circulation and the positions of the Ushant fronts in the Iroise Sea. The data assimilation method used is the Ensemble Kalman Filter. The originality of this method is the use of stochastic modeling to estimate the model error. First, ensemble simulations were carried out by perturbing various model parameters which are the sources of model error: meteorological forcing, bottom friction, horizontal turbulent closure and surface roughness. These ensembles have been explored in terms of dispersion and correlation. An Ensemble Kalman smoother was used to optimize the bottom friction (z0) from the surface current data and from an ensemble produced from a perturbed and spatialized z0. The method is tested with a twin experiment and then with real observations. The optimized maps of the parameter z0, produced with the real currents, were used in the model over another period and the results were compared with independent observations. Finally, twin experiments were conducted to test the correction of the model state. Two approaches were compared: in the first, only the low-frequency signal, obtained by filtering the tide in the data and in the model, is used to perform the analysis; the other approach takes the whole signal into account. With these experiments, we assess the filter's ability to control both the observed part of the state vector (currents) and the unobserved part of the system (sea surface temperature)
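For reference, the generic (ensemble) Kalman analysis step underlying this kind of assimilation can be written as follows; the notation is standard and not specific to the thesis:

```latex
% Generic (ensemble) Kalman analysis step; standard notation.
% x^f_k : k-th forecast ensemble member,   y_k : (perturbed) observations, e.g. HF-radar currents,
% H : observation operator,   R : observation-error covariance,
% P^f : forecast-error covariance estimated from the ensemble spread.
\[
  x^{a}_{k} \;=\; x^{f}_{k} + K\bigl(y_{k} - H x^{f}_{k}\bigr),
  \qquad
  K \;=\; P^{f} H^{\top}\bigl(H P^{f} H^{\top} + R\bigr)^{-1}.
\]
% Applied as a smoother over a time window, the same update serves to adjust the
% bottom-friction parameter z0 from the observed surface currents.
```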
Nalpas, Thierry. "Inversion des grabens du sud de la mer du Nord. Données de sub-surface et modélisation analogique." Phd thesis, Université Rennes 1, 1994. http://tel.archives-ouvertes.fr/tel-00656044.