Dissertations / Theses on the topic 'Régression sur la médiane'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Régression sur la médiane.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Coudin, Élise. "Inférence exacte et non paramétrique dans les modèles de régression et les modèles structurels en présence d'hétéroscédasticité de forme arbitraire." Thèse, Paris, EHESS, 2007. http://hdl.handle.net/1866/1506.
Full text
Grollemund, Paul-Marie. "Régression linéaire bayésienne sur données fonctionnelles." Thesis, Montpellier, 2017. http://www.theses.fr/2017MONTS045.
Full text
The linear regression model is a common tool for the statistician. If a covariate is a curve, we face a high-dimensional problem. In this case, sparse models lead to successful inference, for instance by expanding the functional covariate on a lower-dimensional space. In this thesis, we propose a Bayesian approach, named Bliss, to fit the functional linear regression model. The Bliss model assumes, through the prior, that the coefficient function is a step function. From the posterior, we propose several estimators to be used depending on the context: an estimator of the support and two estimators of the coefficient function, a smooth one and a stepwise one. To illustrate this, we explain the black Périgord truffle yield with the rainfall during the truffle life cycle. The Bliss method succeeds in selecting two relevant periods for truffle development. As another feature of the Bayesian paradigm, the prior distribution enables the integration of preliminary judgments into the statistical inference. For instance, the biologists' knowledge about truffle growth is relevant for informing the Bliss model. To this end, we propose two modifications of the Bliss model to take preliminary judgments into account. First, we indirectly collect preliminary judgments using pseudo-data provided by experts. The proposed prior distribution corresponds to the posterior distribution given the experts' pseudo-data. Furthermore, the effect of each expert and their correlations are controlled by weighting. Second, we collect experts' judgments about the periods most influencing truffle yield and whether the effect is positive or negative. The proposed prior distribution relies on a penalization of coefficient functions which do not conform to these judgments. Lastly, the asymptotic behavior of the Bliss method is studied. We validate the proposed approach by showing the posterior consistency of the Bliss model.
Using model-specific assumptions, an efficient proof of the Wald theorem is given. The main difficulty is the misspecification of the model, since the true coefficient function is surely not a step function. We show that the posterior distribution contracts on a step function which is the Kullback-Leibler projection of the true coefficient function onto a set of step functions. This step function is derived from the true parameter and the design.
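As a rough illustration of the idea behind a step-function coefficient in functional linear regression (a generic sketch on synthetic data, not the authors' Bayesian Bliss implementation), one can average each functional covariate over fixed bins and fit the resulting piecewise-constant coefficient by least squares:

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 200, 100                      # samples, grid points on [0, 1]
t = np.linspace(0, 1, T)

# Functional covariates: smooth random curves (scaled random walks)
X = np.cumsum(rng.normal(size=(n, T)), axis=1) / np.sqrt(T)

# True coefficient function: a step function, nonzero on two "periods"
beta_true = np.where((t > 0.2) & (t < 0.35), 3.0, 0.0) \
          + np.where((t > 0.7) & (t < 0.8), -2.0, 0.0)
y = X @ beta_true / T + rng.normal(scale=0.05, size=n)   # Riemann sum of ∫ X(t)β(t) dt

# Step-function estimator: average X over K fixed bins, then ordinary least squares
K = 10
Z = X.reshape(n, K, T // K).mean(axis=2) / K             # ∫ over a bin ≈ bin mean × bin width
coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
beta_hat = np.repeat(coef, T // K)                        # piecewise-constant estimate on the grid
```

Here the bin boundaries are fixed in advance; the point of the Bliss approach is precisely to infer the support of the steps from the posterior rather than fixing it.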
Ferchichi, Seifeddine. "Nouvelle approche de calcul de l'axe de médiane d'objets basée sur le clustering." Mémoire, Université de Sherbrooke, 2004. http://savoirs.usherbrooke.ca/handle/11143/4591.
Full text
Sidi, Zakari Ibrahim. "Sélection de variables et régression sur les quantiles." Thesis, Lille 1, 2013. http://www.theses.fr/2013LIL10081/document.
Full text
This work is a contribution to the selection of statistical models, and more specifically to variable selection in penalized linear quantile regression when the dimension is high. It focuses on two points in the selection process: the stability of selection and the inclusion of variables through the grouping effect. As a first contribution, we propose a transition from penalized least squares regression to quantile regression (QR). A bootstrap approach based on the selection frequency of each variable is proposed for the construction of linear models (LM). In most cases, the QR approach provides more significant coefficients. A second contribution is to adapt some algorithms of the "Random" LASSO (Least Absolute Shrinkage and Selection Operator) family to QR and to propose methods for selection stability. Examples from food security illustrate the obtained results. In the setting of penalized QR in high dimension, the grouping effect property is established under weak conditions, along with the oracle property. Two examples of real and simulated data illustrate the regularization paths of the proposed algorithms. The last contribution deals with variable selection for generalized linear models (GLM) using the nonconcave penalized likelihood. We propose an algorithm to maximize the penalized likelihood for a broad class of non-convex penalty functions. The convergence property of the algorithm and the oracle property of the estimator obtained after one iteration have been established. Simulations and an application to real data are also presented.
Delsol, Laurent. "Régression sur variable fonctionnelle : estimation, tests de structure et applications." Phd thesis, Université Paul Sabatier - Toulouse III, 2008. http://tel.archives-ouvertes.fr/tel-00449806.
Full text
Arfi, Mounir. "Sur la régression non paramétrique d'un processus stationnaire mélangeant ou ergodique." Paris 6, 1996. http://www.theses.fr/1996PA066012.
Full text
Qannari, Abdellah. "Approximations de matrices et régression linéaire sur des prédicteurs quasi-colinéaires." Rennes 2, 1993. http://www.theses.fr/1993REN20024.
Full text
Sahaly, Ridha. "Effet de la consigne sur les indices mécaniques et électromyographiques de la contraction musculaire isométrique." Paris 6, 2004. http://www.theses.fr/2004PA066294.
Full text
Vincent, Pierre-Luc. "Tests de régression dans les systèmes orientés objet : une approche basée sur les modèles." Thèse, Université du Québec à Trois-Rivières, 2009. http://depot-e.uqtr.ca/1976/1/030131509.pdf.
Full text
Laloë, Thomas. "Sur quelques problèmes d'apprentissage supervisé et non supervisé." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2009. http://tel.archives-ouvertes.fr/tel-00455528.
Full text
Escaffre, Lionel. "Contribution à l'analyse des déterminants de l'offre d'information sur le capital intellectuel." Paris 9, 2002. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=2002PA090026.
Full text
Vigneault, Thomas. "Analyse de l'efficacité d'action emploi par régression discontinue." Thesis, Université Laval, 2013. http://www.theses.ulaval.ca/2013/29419/29419.pdf.
Full text
Taupin, Marie-Luce. "Estimation semi-paramétrique pour le modèle de régression non linéaire avec erreurs sur les variables." Paris 11, 1998. http://www.theses.fr/1998PA112004.
Full text
Colliez, Johan. "Estimation robuste du mouvement dans les séquences d'images basée sur la régression par vecteurs supports." Littoral, 2009. http://www.theses.fr/2009DUNK0231.
Full text
One of the main tasks in computer vision is to extract the relevant information from an image sequence using regression and model-fitting theory. However, the presence of noise and outliers alters the task of estimating the structure of the underlying model; hence the need for estimators that are robust to the errors inherent in natural scene images. In this work, we propose a new robust estimator based on support vector machines. This estimator is a weighted version of support vector regression: it assigns a heterogeneous penalty to observations according to whether they belong to the inlier or outlier class. Hard and soft penalizations were considered, and an iterative approach was applied to extract the dominant structure in the data set. Numerous simulated data sets indicate that the proposed robust support vector estimator has a breakdown point above 50%, significantly improving the performance of standard support vector regression. Moreover, it can extract the dominant structure in the data set with high resistance to residual structures. The robust regression approach was applied to motion estimation in image sequences, both by optical flow and by image matching.
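A minimal sketch of the iterative reweighting idea, assuming scikit-learn's `SVR` as the base learner and a soft Cauchy-type weight (the thesis's exact penalization schemes and its optic-flow application are not reproduced):

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)
n = 200
x = rng.uniform(-3, 3, n)
y = np.sin(x) + rng.normal(scale=0.1, size=n)
out = rng.choice(n, 40, replace=False)          # 20% gross outliers
y[out] += rng.normal(loc=4.0, scale=0.5, size=40)

X = x[:, None]
w = np.ones(n)
for _ in range(5):                              # iterative reweighting
    svr = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(X, y, sample_weight=w)
    r = np.abs(y - svr.predict(X))
    s = np.median(r) + 1e-12                    # robust residual scale
    w = 1.0 / (1.0 + (r / (3 * s)) ** 2)        # soft penalization of large residuals

inliers = w > 0.5                               # points retained in the dominant structure
```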
Fréchette, Nicolas. "Test de régression dans les systèmes orientés objet : une approche statique basée sur le code." Thèse, Université du Québec à Trois-Rivières, 2010. http://depot-e.uqtr.ca/1972/1/030149018.pdf.
Full text
Vandal, Nathalie. "La régression non paramétrique multidimensionnelle. Théorie et application à une étude portant sur la densité mammaire." Thesis, Université Laval, 2005. http://www.theses.ulaval.ca/2005/23252/23252.pdf.
Full textInscrite au Tableau d'honneur de la Faculté des études supérieures
Coq, Guilhelm. "Utilisation d'approches probabilistes basées sur les critères entropiques pour la recherche d'information sur supports multimédia." Poitiers, 2008. http://theses.edel.univ-poitiers.fr/theses/2008/Coq-Guilhelm/2008-Coq-Guilhelm-These.pdf.
Full text
Model selection problems appear frequently in a wide array of application domains such as data compression and signal or image processing. One of the most used tools to solve these problems is a real quantity to be minimized, called an information criterion or penalized likelihood criterion. The principal purpose of this thesis is to justify the use of such a criterion for a given model selection problem, typically set in a signal processing context. The sought justification must have a strong mathematical background. To this end, we study the classical problem of determining the order of an autoregression. We also work on Gaussian regression, allowing the extraction of the principal harmonics from a noisy signal. In those two settings we give a criterion whose use is justified by the minimization of the cost resulting from the estimation. Multiple Markov chains model most discrete signals, such as letter sequences or grey-scale images. We consider the determination of the order of such a chain. We then study the seemingly distant problem of estimating an unknown density by a histogram. For those two domains, we justify the use of a criterion through coding notions, to which we apply a simple form of the "Minimum Description Length" principle. Throughout those application domains, we present alternative methods for using information criteria. Those methods, called comparative, are simpler to use than the usual methods but nevertheless allow a precise description of the model.
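For the autoregression-order problem mentioned above, a standard penalized criterion (here BIC, used as a generic illustration rather than the thesis's own criterion) can be sketched as:

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
# Simulate an AR(2): x_t = 0.6 x_{t-1} - 0.3 x_{t-2} + e_t
x = np.zeros(T)
for t in range(2, T):
    x[t] = 0.6 * x[t - 1] - 0.3 * x[t - 2] + rng.normal()

def ar_bic(x, k):
    """Fit AR(k) by least squares and return a BIC-type criterion."""
    T = len(x)
    y = x[k:]
    if k == 0:
        resid = y
    else:
        # Column j holds lag j+1: x_{t-1}, ..., x_{t-k}
        Z = np.column_stack([x[k - 1 - j : T - 1 - j] for j in range(k)])
        coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ coef
    n = len(y)
    sigma2 = np.mean(resid ** 2)
    return n * np.log(sigma2) + k * np.log(n)   # goodness of fit + complexity penalty

best = min(range(7), key=lambda k: ar_bic(x, k))
```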
Langlet, Fanny. "Etude de l'interface sang-noyau arqué hypothalamique au cours d'un déséquilibre énergétique : plasticité de l'éminence médiane et impact sur la régulation de la prise alimentaire." Phd thesis, Université du Droit et de la Santé - Lille II, 2013. http://tel.archives-ouvertes.fr/tel-00965922.
Full text
Boularan, Joël. "Deux modèles de régression. Etude théorique et exemples : [thèse en partie soutenue sur un ensemble de travaux]." Toulouse 3, 1993. http://www.theses.fr/1993TOU30225.
Full textLy, Boucar, and Boucar Ly. "Simulations Monte Carlo et tests de score sur les matrices nulles : approche par inférence exacte." Master's thesis, Université Laval, 2019. http://hdl.handle.net/20.500.11794/37854.
Full text
Ce document propose des outils de simulation de matrices nulles basés sur la loi conditionnelle d'une matrice de présence-absence sachant ses statistiques exhaustives. Ces outils sont basés sur la régression logistique et, de plus, ils tiennent compte de l'hétérogénéité des sites et aussi de l'interaction qui peut exister entre les variables qui définissent cette hétérogénéité. Dans ce travail, nous avons traité le cas où les variables qui caractérisent l'hétérogénéité des sites sont binaires et elles sont au plus au nombre de deux. Ainsi, deux outils ont été mis en place, à savoir l'algorithme basé sur la régression logistique avec interaction entre les deux variables sites et celui sans interaction entre les variables sites. À partir d'une étude de simulation sur 10 000 matrices de présence-absence, nous avons pu, non seulement décrire les propriétés des algorithmes mis en place, mais aussi comparer ces derniers avec d'autres algorithmes de simulation de matrices nulles. Ces comparaisons ont permis de constater que les tests de score avec les algorithmes basés sur la régression logistique avec ou sans interaction entre les variables sites donnent des résultats acceptables peu importe l'impact des variables sites. En revanche, l'algorithme « fixed-fixed », lorsque les variables sites ont des effets alternés, devient vulnérable aux erreurs de type I. Avec l'algorithme basé sur le modèle d'indépendance, les résultats obtenus ne sont pas fiables parce que le test est très vulnérable aux erreurs de type I. Pour l'algorithme de Peres-Neto, le test de score est très conservateur mais celui-ci s'améliore avec les variables sites à effets alternés. Pour finir, ces différents algorithmes ont été utilisés pour simuler des matrices nulles à partir d'un jeu de données réelles. Cela nous a permis de comparer la structure des matrices simulées par les différents algorithmes par rapport à celle de la matrice observée.
This document proposes tools for simulating null matrices based on the conditional law of a presence-absence matrix given its sufficient statistics. These tools are based on logistic regression and, moreover, they take into account the heterogeneity of the sites as well as the interaction that can exist between the variables defining this heterogeneity. In this work, we treated the case where the variables characterizing the heterogeneity of the sites are binary and number at most two. Thus, two tools were put in place, namely the logistic regression algorithm with interaction between the two site variables and the one without interaction between the site variables. From a simulation study on 10 000 presence-absence matrices, we were able not only to describe the properties of the implemented algorithms, but also to compare them with other null matrix simulation algorithms. These comparisons showed that the score tests with the logistic-regression-based algorithms, with or without interaction between the site variables, give acceptable results regardless of the impact of the site variables. On the other hand, the 'fixed-fixed' algorithm becomes vulnerable to type I errors when the site variables have alternate effects. With the algorithm based on the independence model, the results obtained are not reliable because the test is very vulnerable to type I errors. For the Peres-Neto algorithm, the score test is very conservative, but it improves with alternate-effect site variables. Finally, these different algorithms were used to simulate null matrices from a real data set. This enabled us to compare the structure of the matrices simulated by the different algorithms with that of the observed matrix.
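A crude sketch of simulating null presence-absence matrices from a fitted logistic regression with two binary site variables and their interaction (plain unconditional Bernoulli draws on synthetic data; the thesis's algorithms instead sample from the conditional law given the sufficient statistics, which this sketch does not attempt):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
n_sites, n_species = 60, 30

# Two binary site variables and their interaction drive site suitability
v1 = rng.integers(0, 2, n_sites)
v2 = rng.integers(0, 2, n_sites)
logit = -0.5 + 1.2 * v1 - 0.8 * v2 + 1.0 * v1 * v2
p = 1 / (1 + np.exp(-logit))
M = rng.binomial(1, p[:, None], size=(n_sites, n_species))   # "observed" matrix

# Fit a site-level occupancy model with interaction, then draw null matrices from it
Xsite = np.column_stack([v1, v2, v1 * v2])
Xrep = np.repeat(Xsite, n_species, axis=0)       # one row per (site, species) cell
lr = LogisticRegression().fit(Xrep, M.ravel())
phat = lr.predict_proba(Xrep)[:, 1].reshape(n_sites, n_species)

nulls = [rng.binomial(1, phat) for _ in range(100)]          # 100 null matrices
```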
Escaffre, Lionel. "Contribution à l'analyse des déterminants de l'offre d'information sur le capital intellectuel." Phd thesis, Université Paris Dauphine - Paris IX, 2002. http://tel.archives-ouvertes.fr/tel-00769320.
Full text
Prévot, Vincent. "Étude sur la modulation de la sécrétion de GnRH dans la zone externe de l'éminence médiane : implication d'une plasticité stéroïdo-dépendante et rôle du monoxyde d'azote." Lille 1, 1999. https://pepite-depot.univ-lille.fr/LIBRE/Th_Num/1999/50376-1999-141.pdf.
Full text
Renahy, Emilie. "Recherche d'information en matière de santé sur Internet : déterminants, pratiques et impact sur la santé et le recours aux soins." Paris 6, 2008. http://www.theses.fr/2008PA066087.
Full text
Mercuriali, Pierre. "Sur les systèmes de formes normales pour représenter efficacement des fonctions multivaluées." Electronic Thesis or Diss., Université de Lorraine, 2020. http://www.theses.fr/2020LORR0241.
Full text
In this document, we study representations of a given semantic content that are efficient in terms of size. We first extend an equational specification of median forms from the domain of Boolean functions to that of lattice polynomials over distributive lattices, both domains that are crucial in artificial intelligence. This specification is sound and complete: it allows us to algebraically simplify median forms into median normal forms (MNF), which we define as minimal median formulas with respect to a structural ordering of expressions. We investigate related complexity issues and show that the problem of deciding whether a formula is in MNF, that is, minimizing the median form of a monotone Boolean function, is in Σ₂ᴾ, the second level of the polynomial hierarchy; we show that this result holds for arbitrary Boolean functions as well. We then study other normal form systems (NFSs), thought of, more generally, as sets of stratified terms over a fixed sequence of connectives, such as (m, NOT) in the case of the MNF. For a fixed NFS A, the complexity of a Boolean function f with respect to A is the minimum of the sizes of terms in A that represent f. This induces a preordering of NFSs: an NFS A is polynomially as efficient as an NFS B if there is a polynomial P with nonnegative integer coefficients such that the complexity of any Boolean function f with respect to A is at most the value of P at the complexity of f with respect to B. We study monotonic NFSs, i.e., NFSs whose connectives are increasing or decreasing in each argument. We describe optimal monotonic NFSs, that is, those that are minimal with respect to the latter preorder, and show that they are all equivalent. Optimal monotonic NFSs are exactly those that use either a single connective or one connective and the negation. Finally, we show that optimality does not depend on the arity of the connective.
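The ternary median (majority) connective m at the heart of median normal forms can be illustrated on Boolean values; the identities checked below (conjunction and disjunction as medians with a constant, and the standard median-algebra associativity axiom) are classical facts, not results specific to the thesis:

```python
import itertools

# Ternary Boolean median (majority) connective
def m(x, y, z):
    return (x & y) | (x & z) | (y & z)

bits = (0, 1)

# Conjunction and disjunction are medians with a constant third argument
assert all(m(x, y, 0) == (x & y) for x in bits for y in bits)
assert all(m(x, y, 1) == (x | y) for x in bits for y in bits)

# Median-algebra associativity: m(x, w, m(y, w, z)) == m(m(x, w, y), w, z)
for x, w, y, z in itertools.product(bits, repeat=4):
    assert m(x, w, m(y, w, z)) == m(m(x, w, y), w, z)
```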
Vau, Bernard. "Algorithmes d’identification par régression pseudo-linéaire avec prédicteurs paramétrisés sur des bases généralisées de fonctions de transfert orthonormales." Thesis, Université Paris-Saclay (ComUE), 2019. http://www.theses.fr/2019SACLN062.
Full text
This thesis deals with the identification of linear time-invariant systems described by discrete-time transfer functions. For a given order, contrary to identification methods that explicitly minimize the prediction error variance, algorithms based on pseudo-linear regression produce models with a bias distribution that depends on the predictor parametrization. This is demonstrated via the novel concept of the equivalent prediction error, a signal in general non-measurable, whose variance is effectively minimized by the pseudo-linear regression. In a second step, revisited versions of recursive algorithms are proposed (output error, extended least squares, and their closed-loop equivalents), whose predictors are expressed on generalized bases of orthonormal transfer functions introduced by Heuberger et al. in the 1990s and 2000s. Selecting the basis poles is equivalent to defining the reproducing kernel of the Hilbert space associated with these functions, and to imposing how the algorithms carry out the approximation. A particular expression of this reproducing kernel is employed to introduce an indicator of the effect of the basis poles on the model fit in the frequency domain. This indicator plays a key heuristic role. Finally, a validation test consistent with these algorithms is proposed, and its statistical properties are given. This set of algorithms provides the user with simple tuning parameters (the basis poles) that can be selected according to the purpose assigned to the identification procedure. Obtaining reduced-order models is made easier, while the identification of stiff systems, until now impossible in discrete time, becomes accessible.
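A much-simplified sketch of identification on an orthonormal basis of transfer functions, using a discrete Laguerre bank (a special case of the generalized bases of Heuberger et al.) and ordinary least squares instead of the recursive pseudo-linear algorithms; the system, the pole a and the orders are invented for illustration:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(5)
N, a, K = 2000, 0.6, 6                      # samples, Laguerre pole, basis size
u = rng.normal(size=N)                      # excitation signal

# "True" system to identify: an arbitrary stable second-order filter plus noise
y = lfilter([0.0, 0.5, 0.3], [1.0, -0.7, 0.12], u) + 0.02 * rng.normal(size=N)

# Outputs of the discrete Laguerre filter bank driven by u:
# L_1(z) = c / (1 - a z^-1), then repeated all-pass factors (z^-1 - a)/(1 - a z^-1)
c = np.sqrt(1 - a ** 2)
phi = np.empty((N, K))
x = lfilter([c], [1.0, -a], u)
phi[:, 0] = x
for k in range(1, K):
    x = lfilter([-a, 1.0], [1.0, -a], x)
    phi[:, k] = x

# Linear-in-parameters model: y ≈ phi @ theta
theta, *_ = np.linalg.lstsq(phi, y, rcond=None)
yhat = phi @ theta
fit = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)   # coefficient of determination
```

The basis pole a plays the role of the tuning parameter discussed in the abstract: moving it changes where in the frequency domain the approximation effort is concentrated.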
Nkengne, Nguimezong Alex Albert. "Prédire l'âge de personnes à partir de photos du visage : une étude fondée sur la caractérisation et l'analyse de signes du vieillissement." Paris 6, 2008. https://tel.archives-ouvertes.fr/tel-00812718.
Full text
Age has always been an important identity attribute. In order to build an algorithm for predicting age from a frontal face picture, we studied the signs of facial aging and their incidence on perceived age. First, we analyzed the anatomical transformations that alter the adult face. Second, we determined the aging-related features that most strongly drive the human perception of age. Finally, we built and validated a model predicting age from frontal face pictures. This model uses Partial Least Squares (PLS) regression to summarize the facial information related to aging. Thanks to this model, a person's age can be predicted with accuracy comparable to that of human observers.
Ould, Aboubecrine Mohamed Mahmoud. "Sur l'estimation basée sur les records et la caractérisation des populations." Le Havre, 2011. http://www.theses.fr/2011LEHA0004.
Full text
In the first part of this work, we consider a number of k-record values from independent and identically distributed random variables with a continuous distribution function F; our aim is to predict future k-record values under suitable assumptions on the tail of F. In the second part, we consider finite populations and investigate their characterization by regressions of order statistics under sampling without replacement. We also give some asymptotic results as the size of the population goes to infinity.
Dabo-Niang, Sophie. "Sur l'estimation fonctionnelle en dimension infinie : application aux diffusions." Paris 6, 2002. http://www.theses.fr/2002PA066273.
Full text
Nkengne, Alex A. "Prédire l'âge de personnes à partir de photos du visage : une étude fondée sur la caractérisation et l'analyse de signes du vieillissement." Phd thesis, Université Pierre et Marie Curie - Paris VI, 2008. http://tel.archives-ouvertes.fr/tel-00812718.
Full textMonnier, Jean-Baptiste. "Quelques contributions en classification, régression et étude d'un problème inverse en finance." Phd thesis, Université Paris-Diderot - Paris VII, 2011. http://tel.archives-ouvertes.fr/tel-00650930.
Full text
Belzile, Martin. "Analyse de survie sur les prédicteurs de la durée d'un processus thérapeutique individuel chez les hommes auteurs de violence conjugale." Thèse, Université de Sherbrooke, 2016. http://hdl.handle.net/11143/9486.
Full textBoulinguiez, Benoît. "Procédé d'adsorption et régénération électrothermique sur textile de carbone activé : une solution pour la problématique des COV dans des gaz à fort potentiel énergétique." Rennes 1, 2010. https://tel.archives-ouvertes.fr/tel-00540206.
Full text
An adsorption and electrothermal desorption process on activated carbon fabric is proposed to address the issue of volatile organic compounds at trace concentrations in methane-rich gases: biogas and natural gas. The experimental procedure is divided into two connected parts: first, assessing the potential of several materials and selecting the most relevant for the working conditions, by modelling the chemical and physical phenomena bound to adsorption and desorption; second, implementing this fabric in a dedicated lab-scale pilot unit designed to perform continuous treatment. Adsorption and desorption under steady conditions of the studied volatile organic compounds (toluene, isopropanol, methylene chloride, ethanethiol, octamethylcyclotetrasiloxane and tetrahydrothiophene) on several activated carbon fabrics are characterised, modelled and quantified in order to design the lab-scale pilot unit.
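Adsorption equilibria of this kind are often summarized by isotherm models; as a generic illustration (not the models actually fitted in the thesis), a Langmuir isotherm can be fitted to synthetic equilibrium data with SciPy:

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic equilibrium data: adsorbed amount q (mg/g) vs gas-phase concentration C (mg/m3)
rng = np.random.default_rng(7)
C = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 250.0, 500.0])
qm_true, b_true = 180.0, 0.015
q = qm_true * b_true * C / (1 + b_true * C) * (1 + 0.03 * rng.normal(size=C.size))

def langmuir(C, qm, b):
    """Langmuir isotherm: q = qm * b * C / (1 + b * C)."""
    return qm * b * C / (1 + b * C)

# Nonlinear least-squares fit of the saturation capacity qm and affinity b
(qm, b), _ = curve_fit(langmuir, C, q, p0=[100.0, 0.01])
```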
Hindié, Mathilde. "Orientations fonctionnelles de cellules adhérentes (mélanomes B16 et préostéoblastes MCT3) cultivées sur un support cellulosique." Compiègne, 2004. http://www.theses.fr/2004COMP1535.
Full textBiomaterials according to their nature and their surface properties can give to cells in their contact, different morphologies which will be determinant to their functional orientations. This thesis airn is to study the reactions of cells cultivated on an original coating for cell culture. Adherent cells lines used during this study are three cells lines of murine melanoma cells B 16 with growing metastatic power and MC3T3 preosteoblasts. Principal results obtained are : a cellular morphology change which is manifested by cells aggregation ; a cellular proliferation inhibition ; a differentiation induction and/or apoptosis This coating permits us to obtain significant and reproducible results which open new perspectives for its utilization as a fundamental research tool and as a malignity diagnostic help tool
Frouin, Arthur. "Lien entre héritabilité et prédiction de phénotypes complexes chez l’humain : une approche du problème par la régression ridge sur des données de population." Thesis, université Paris-Saclay, 2020. http://www.theses.fr/2020UPASL027.
Full text
This thesis studies the contribution of machine learning methods for the prediction of complex and heritable human phenotypes, from population genetic data. Indeed, genome-wide association studies (GWAS) generally only explain a small fraction of the heritability observed in family data. However, heritability can be approximated on population data by genomic heritability, which estimates the phenotypic variance explained by the set of single nucleotide polymorphisms (SNPs) of the genome using mixed models. This thesis therefore approaches heritability from a machine learning perspective and examines the close link between mixed models and ridge regression. Our contribution is twofold. First, we propose to estimate genomic heritability using a predictive approach via ridge regression and generalized cross validation (GCV). Second, we derive simple formulas that express the precision of the ridge regression prediction as a function of the size of the population and the total number of SNPs, showing that a high heritability does not necessarily imply an accurate prediction. Heritability estimation via GCV and prediction precision formulas are validated using simulated data and real data from UK Biobank. The last part of the thesis presents results on qualitative phenotypes. These results allow a better understanding of the biases of the heritability estimation methods.
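The ridge/GCV machinery referred to above can be sketched via the SVD; the heritability-style quantity at the end is a rough plug-in proxy computed on synthetic genotypes, not the GCV estimator actually derived in the thesis:

```python
import numpy as np

rng = np.random.default_rng(8)
n, p = 300, 1000                       # individuals, SNPs (p >> n)
X = rng.binomial(2, 0.3, size=(n, p)).astype(float)
X = (X - X.mean(0)) / X.std(0)         # standardized genotypes
beta = rng.normal(scale=np.sqrt(0.5 / p), size=p)    # h2 = 0.5 spread over all SNPs
y = X @ beta + rng.normal(scale=np.sqrt(0.5), size=n)

U, d, Vt = np.linalg.svd(X, full_matrices=False)
Uty = U.T @ y

def gcv(lam):
    """Generalized cross-validation score of ridge regression at penalty lam."""
    shrink = d ** 2 / (d ** 2 + lam)   # eigen-shrinkage factors of the hat matrix
    fitted = U @ (shrink * Uty)
    df = shrink.sum()                  # effective degrees of freedom = trace of hat matrix
    rss = np.sum((y - fitted) ** 2)
    return n * rss / (n - df) ** 2

lams = np.logspace(1, 5, 60)
lam_best = lams[np.argmin([gcv(l) for l in lams])]

# Crude proxy for the share of phenotypic variance captured at the GCV optimum
shrink = d ** 2 / (d ** 2 + lam_best)
h2_hat = np.var(U @ (shrink * Uty)) / np.var(y)
```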
Attaoui, Said. "Sur l'estimation semi paramétrique robuste pour statistique fonctionnelle." Phd thesis, Université du Littoral Côte d'Opale, 2012. http://tel.archives-ouvertes.fr/tel-00871026.
Full text
Achour, Sami. "Schémas d'adaptations algorithmiques sur les nouveaux supports d'exécution parallèles." Thesis, Grenoble, 2013. http://www.theses.fr/2013GRENM086/document.
Full text
With the multitude of emerging parallel platforms, characterized by the heterogeneity of their hardware components (processors, networks, ...), developing high-performance applications and parallel libraries has become a challenge. An approach that has proved suitable for this challenge is the adaptive approach, which uses several parameters (architectural, algorithmic, ...) to optimize the execution of the application on the target platform. Applications adopting this approach must take advantage of performance modeling methods to choose among the alternatives available to them (algorithms, implementations or schedules). The use of these modeling approaches in adaptive applications must obey the constraints imposed by the context, namely prediction speed and accuracy. In this work, we first propose a framework for developing adaptive parallel applications based on theoretical performance modeling. We then focus on the task of performance prediction in parallel and hierarchical environments, proposing a framework that combines different performance modeling methods (analytical, experimental and simulation) to balance the constraints raised. This framework makes use of the installation phase of the application to discover the parallel platform, and of execution traces of the application, in order to model the behavior of two components, namely computing kernels and point-to-point communications. For the modeling of these components, we developed several methods based on experiments and polynomial regression that provide accurate models. The resulting models are used at runtime by our performance prediction tool for MPI programs (MPI-PERF-SIM) to predict their behavior. The framework is validated separately for the different modules, then globally on the matrix product kernel.
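The kernel-modeling step (polynomial regression on runtimes measured during an installation phase) can be sketched as follows; taking dense matrix multiplication as the kernel and a cubic model are illustrative assumptions, not details taken from the thesis:

```python
import time
import numpy as np

_ = np.random.rand(300, 300) @ np.random.rand(300, 300)   # warm up BLAS threads

sizes = np.array([100, 200, 300, 400, 500])
times = []
for s in sizes:
    A = np.random.rand(s, s)
    B = np.random.rand(s, s)
    t0 = time.perf_counter()
    A @ B
    times.append(time.perf_counter() - t0)

# Dense matmul is O(n^3): fit t(n) = a*n^3 + b*n^2 + c*n + d to the measurements
coeffs = np.polyfit(sizes, np.array(times), deg=3)
predict = np.poly1d(coeffs)
t800 = predict(800)   # extrapolated runtime for an unmeasured size
```

At runtime, an adaptive application would evaluate such fitted models for each candidate algorithm or platform and pick the cheapest predicted alternative.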
Meziane-El, May Hédia. "Extension de la régression classique à des problèmes typologiques et présentation de la "méthode des tranches de densité" : une approche basée sur la percolation." Aix-Marseille 3, 1991. http://www.theses.fr/1991AIX32000.
Full text
We tackle the problem of generalizing the regression concept in order to take the data's structural aspects into account. This issue is mostly overlooked, although taking it into consideration would allow a better understanding of a great number of phenomena and would lead to more adequate models. In addition to outliers, other fundamental factors, structural in nature, may also undermine the quality of the models. Our objective is therefore to show that, from the same set of data, it is possible to search automatically for one or even several resulting models. We propose in this thesis a resolution method that is both simple and efficient: the "method of density slices", whose aim is to find underlying "multimodels" when handling heterogeneous data. This method seeks to synthesize regression and classification techniques. The non-hierarchical classification of the data is guided by the "percolation principle" and is carried out simultaneously with the computation of the regression hyperplanes. The percolation principle, which aims to find the points of strong density, is applied to slices of points rather than to individual points. The foundations of this method are discussed, and an algorithm and some results are presented.
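A toy version of multimodel regression on heterogeneous data, using plain alternating assignment and least-squares refits (the density-slices/percolation mechanism of the thesis is not reproduced; initial lines and data are invented):

```python
import numpy as np

rng = np.random.default_rng(9)

# Heterogeneous data: two linear sub-populations mixed in one scatter plot
x = rng.uniform(0, 10, 300)
g = rng.integers(0, 2, 300)
y = np.where(g == 0, 2.0 * x + 1.0, -1.0 * x + 8.0) + rng.normal(scale=0.5, size=300)

X = np.column_stack([x, np.ones_like(x)])              # [slope, intercept] design
models = [np.array([1.0, 0.0]), np.array([-0.5, 5.0])]  # two initial lines

for _ in range(20):                                     # alternate assignment and refit
    resid = np.stack([np.abs(y - X @ m) for m in models])   # (2, n) distances to each line
    labels = resid.argmin(axis=0)
    for k in range(2):
        if np.any(labels == k):
            models[k], *_ = np.linalg.lstsq(X[labels == k], y[labels == k], rcond=None)

slopes = sorted(m[0] for m in models)   # should approach the two underlying slopes
```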
Abdallah, Mouhammed. "Vulnérabilité des ouvrages en maçonnerie à des mouvements de terrain : méthodologie d'analyse par méthodes statistiques et par plans d'expériences numériques sur les données de la ville de Joeuf." Thesis, Vandoeuvre-les-Nancy, INPL, 2009. http://www.theses.fr/2009INPL019N/document.
Full text
The context of our study concerns the ground movements that may occur in Lorraine as a result of mining subsidence events, and their impact on traditional masonry houses. When such an event occurs, houses suffer disorders resulting from the efforts induced in the structure by the movement of the ground. The response that characterizes the state of the structure depends on its geometrical, physical and mechanical characteristics. However, the discontinuous nature of masonry and the complexity of the interactions between masonry blocks make this response difficult to determine; the same is true of the soil-structure interaction. The purpose of this research is to study, by numerical modelling with the distinct element method, experimental design and response surfaces, the behaviour of masonry structures subjected to a typical mining subsidence event, and to define from this study criteria that make it possible to estimate the vulnerability of all the buildings of a city. A first simplified analysis describes the principle of the methodology, which is then applied to all the houses of the city of Joeuf, used as a pilot site. This methodology is based on an analysis of the total length of the open joints, which are considered similar to cracks in the structure. A typology analysis first distinguishes 4 groups (types) of houses with similar characteristics. On each of these groups, the methodology is applied consistently, based on the geometrical characteristics of the house facades, and leads to the formulation of vulnerability functions that use the technique of orthogonal regression.
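Orthogonal regression, mentioned at the end of the abstract, minimises perpendicular rather than vertical distances; a minimal sketch via the SVD of the centred point cloud (the paired indicators are invented, not taken from the Joeuf study):

```python
import numpy as np

# Hypothetical paired indicators (e.g. a facade geometry descriptor vs a
# damage indicator), with errors possible in both variables.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 1.5 * x + 0.5

# Total least squares: the fitted line runs along the first principal
# direction of the centred data matrix.
X = np.column_stack([x - x.mean(), y - y.mean()])
_, _, vt = np.linalg.svd(X, full_matrices=False)
slope = vt[0, 1] / vt[0, 0]
intercept = y.mean() - slope * x.mean()
```

Unlike ordinary least squares, this treatment is symmetric in x and y, which is why it suits settings where both variables are measured with error.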
Touzani, Samir. "Méthodes de surface de réponse basées sur la décomposition de la variance fonctionnelle et application à l'analyse de sensibilité." Phd thesis, Université de Grenoble, 2011. http://tel.archives-ouvertes.fr/tel-00614038.
Full text
Beauregard, Benjamin. "Comparaison de modèles de régression logistique utilisés pour l'analyse de données recueillies dans le cadre d'études de type cas-témoins appariés sur le déplacement animal." Thesis, Université Laval, 2013. http://www.theses.ulaval.ca/2013/29800/29800.pdf.
Full text
El, Jabri Mohammed. "Etude de l'organisation spatiale du tissu conjonctif par analyse d'images basée sur une approche multiéchelles." Phd thesis, Clermont-Ferrand 2, 2008. http://www.theses.fr/2008CLF21831.
Full text
Poilleux-Milhem, Hélène. "Test de validation adaptatif dans un modèle de régression : modélisation et estimation de l'effet d'une discontinuité du couvert végétal sur la dispersion du pollen de colza." Paris 11, 2002. http://www.theses.fr/2002PA112297.
Full text
This thesis is set in the context of the spread of genetically modified organisms in the environment. Several parametric models of the individual pollen dispersal distribution have already been proposed for homogeneous experiments (plants emitting marked pollen surrounded by the same unmarked plants). In order to predict "genetic pollution" in an agricultural landscape, the effect of a discontinuity on pollen flows in a cultivated area (e.g. a road crossing a field) has to be taken into account. This effect was modelled and estimated: according to the size of the discontinuity, it may correspond to a significant acceleration of the pollen flow. Graphical diagnosis methods show that the modelling of the individual pollen dispersal distribution and of the discontinuity effect best fits the data when constant piecewise functions are used. Prior to using parametric models to predict genetic pollution, goodness-of-fit tools are essential. We therefore propose a goodness-of-fit test in a nonlinear Gaussian regression model where the errors are independent and identically distributed. This test does not require any knowledge of the regression function or of the variance of the observations. It generalises the linear hypothesis tests proposed by Baraud et al. (Ann. Statist. 2003, Vol. 31) to nonlinear hypotheses. It is asymptotically of level α, and a set of functions over which it is asymptotically powerful is characterized. It is rate optimal among adaptive procedures over isotropic and anisotropic Hölder classes of alternatives, and it is consistent against directional alternatives that approach the null hypothesis at a rate close to the parametric rate. According to a simulation study, this test is powerful even for fixed sample sizes.
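The adaptive test of the thesis is considerably more refined, but the basic lack-of-fit logic can be illustrated with a classical F-type comparison between a null model and a richer alternative. This sketch uses synthetic data and is not the Baraud-style procedure itself:

```python
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.05, x.size)

def rss(deg):
    """Residual sum of squares of a degree-`deg` polynomial fit."""
    return float(np.sum((np.polyval(np.polyfit(x, y, deg), x) - y) ** 2))

n = x.size
rss0, rss1 = rss(1), rss(2)           # null: linear; alternative: quadratic
F = (rss0 - rss1) / (rss1 / (n - 3))  # large F => the null model lacks fit
```

Because the data are genuinely quadratic, the linear null leaves a large residual and the statistic is far in the rejection region.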
Delgard, Marie Lise. "Étude des effets et du rôle des herbiers à Zostera noltii sur la biogéochimie des sédiments intertidaux." Phd thesis, Université Sciences et Technologies - Bordeaux I, 2013. http://tel.archives-ouvertes.fr/tel-00847777.
Full text
Bitar, Mohammad. "L'impact de la réglementation bancaire sur la stabilité et l'efficience des banques islamiques : une analyse comparée avec les banques conventionnelles." Thesis, Grenoble, 2014. http://www.theses.fr/2014GRENG017/document.
Full text
This PhD dissertation is the first attempt to examine whether banking regulations have the same impact on the stability and efficiency of Islamic banks as on conventional banks. We draw on the Basel III recommendations to investigate the impact of bank capital, liquidity and leverage requirements on the stability and efficiency of Islamic banks compared to conventional banks. A first exploratory study, using Principal Component Analysis, Logit and Probit methods, and OLS regressions, shows that Islamic banks have higher capital, liquidity and profitability, but that they are less stable than their conventional counterparts. A second empirical study examines the stability of Islamic banks using conditional quantile regressions and confirms that Islamic banks are less stable than conventional banks. It also shows that higher capital and lower leverage improve the adjusted profits of small and highly liquid Islamic banks. Liquidity is positively associated with the stability of large Islamic banks, while an opposite effect is detected for small Islamic banks. Finally, we study the efficiency of Islamic banks using Data Envelopment Analysis (DEA) and find that Islamic banks are more efficient than conventional banks. We also find that higher capital and liquidity requirements penalize the efficiency of small and highly liquid Islamic banks, while the opposite is true for financial leverage. These results suggest that, concerning capital requirements for small and highly liquid Islamic banks, a trade-off could be found between stability and efficiency.
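Conditional quantile regression rests on the check (pinball) loss. As a minimal, self-contained illustration (not the bank-level covariate model of the thesis), minimising this loss for an intercept-only model recovers the empirical quantile:

```python
import numpy as np

def pinball(u, tau):
    # check (pinball) loss underlying quantile regression
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

rng = np.random.default_rng(1)
z = rng.normal(0.0, 1.0, 1000)  # stand-in for a bank stability measure

tau = 0.25
grid = np.linspace(-3.0, 3.0, 601)
losses = [pinball(z - c, tau).sum() for c in grid]
best = grid[np.argmin(losses)]  # minimiser ~ empirical 25% quantile
```

Adding covariates turns this into the full quantile regression problem, which lets the effect of a regulation variable differ across the stability distribution rather than only at the mean.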
Bougeard, Stéphanie. "Description et prédiction à partir de données structurées en plusieurs tableaux : Application en épidémiologie animale." Phd thesis, Université Rennes 2, 2007. http://tel.archives-ouvertes.fr/tel-00267595.
Full text
Arwin, Arwin. "Modélisation des ressources en eau et leur exploitation énergétique sur l'exemple du bassin supérieur du Citarum en Indonésie." Toulouse, INPT, 1992. http://www.theses.fr/1992INPT048H.
Full text
Radermecker, Anne-Sophie. "La valeur marchande du nom d'artiste. Une étude empirique sur le marché de la peinture flamande (1946-2015)." Doctoral thesis, Universite Libre de Bruxelles, 2019. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/285721.
Full text
Doctorate in History, Art and Archaeology
Delecroix, Michel. "Sur l'estimation et la prévision non paramétrique des processus ergodiques. Cycles économiques et taches solaires." Lille 1, 1987. http://www.theses.fr/1987LIL10146.
Full text
Bessaoud, Faïza. "Étude de facteurs de risque classiques et alimentaires du cancer du sein sur une population cas-témoins de l'Hérault et intérêt de la méthode de régression spline logistique." Montpellier 1, 2005. http://www.theses.fr/2005MON1T019.
Full textNyock, Ilouga Samuel. "La congruence objective entre le profil structurel organisationnel et l'individu : effets sur la satisfaction au travail et l'engagement normatif envers l'organisation : utilisation du modèle de la régression polynomiale." Lille 3, 2007. http://www.theses.fr/2007LIL30001.
Full text
We cannot understand the psychological dynamics of an individual at work without taking into account the complex interactions between his motivations and organisational practices. Yet the research done so far has not adopted this approach, as we seek to demonstrate in the first part of our work. In the second part, our goal is to show that, once the existence of links between the personality of an individual and the organisation around him is established, these links can serve as a starting point for the construction of the psychological climate, taking into account not only the interactions between the two components but also individuals' attitudes and behaviours at work. In this perspective, we use objective congruence indices, which mathematically consist in the solution of a polynomial regression equation. These congruence indices permit the study of the links between the pair of factors (the organisational structural profile and the individual's cultural references) and the dependent variables (satisfaction at work and normative commitment toward the organisation), on the basis of the coefficients given to them in the structural equation model. This procedure makes it possible to examine the shape of the response surface representing, in three dimensions, the relations between the two factors and employees' attitudes. The factors taken separately must obey a set of constraints, so as to preserve the direct effects of each of the two independent variables taken separately (Edwards, 1993, 1994). The main result of our work reveals a genuine objective congruence effect, on satisfaction at work and normative commitment toward the organisation, between the organisation's perceived emphasis on involvement toward colleagues and the importance the individual attaches to interpersonal solidarity. This effect is more pronounced among Gabonese employees than among their French counterparts. Moreover, the levels of satisfaction and normative commitment related to social support increase with congruence and decrease with the gap.
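The Edwards-style polynomial regression behind such congruence analyses regresses the outcome on both component scores plus their quadratic and interaction terms. A sketch on synthetic data follows; the variable names and the peaked-at-agreement effect are assumptions for illustration, not the thesis's data:

```python
import numpy as np

rng = np.random.default_rng(7)
P = rng.uniform(-1.0, 1.0, 200)  # organisational structural profile score
I = rng.uniform(-1.0, 1.0, 200)  # individual cultural reference score

# Hypothetical congruence effect: satisfaction peaks where P and I agree.
S = 5.0 - (P - I) ** 2 + rng.normal(0.0, 0.1, 200)

# Polynomial regression of S on P, I, P^2, P*I, I^2 (Edwards, 1993):
X = np.column_stack([np.ones_like(P), P, I, P**2, P * I, I**2])
beta, *_ = np.linalg.lstsq(X, S, rcond=None)
# The fitted quadratic surface can then be inspected along the congruence
# line P = I and the incongruence line P = -I.
```

Recovering coefficients close to (-1, +2, -1) on the three second-order terms is the signature of a congruence effect, since 5 - (P - I)^2 = 5 - P^2 + 2PI - I^2.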