Dissertations / Theses on the topic 'Bootstrap (Statistique)'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Bootstrap (Statistique).'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Chauvet, Guillaume. "Méthodes de Bootstrap en population finie." Rennes : Université Rennes 2, 2008. http://tel.archives-ouvertes.fr/tel-00267689/fr.
Bertail, Patrice. "La méthode du Bootstrap : quelques applications et résultats théoriques." Paris 9, 1992. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1992PA090023.
Zabalza-Mezghani, Isabelle. "Analyse statistique et planification d'expérience en ingénierie de réservoir." Pau, 2000. http://www.theses.fr/2000PAUU3009.
Clémençon, Stéphan. "Résumé des Travaux en Statistique et Applications des Statistiques." Habilitation à diriger des recherches, Université de Nanterre - Paris X, 2006. http://tel.archives-ouvertes.fr/tel-00138299.
Kersaudy, Pierric. "Modélisation statistique de l'exposition humaine aux ondes radiofréquences." Thesis, Paris Est, 2015. http://www.theses.fr/2015PESC1120/document.
This thesis addresses the management and characterization of the variability of human exposure to radio-frequency waves through numerical dosimetry. Although recent advances in high-performance computing have significantly reduced simulation times for exposure assessment, computing the specific absorption rate remains time-consuming. Given the variability of usage, this constraint prevents the influence of random input parameters on exposure from being analysed with classical approaches such as Monte Carlo simulation. Two approaches are proposed to address this problem. The first uses and hybridizes surrogate-model construction methods in order to study the global influence of the input parameters. The second aims to estimate the 95th percentiles of the output distributions efficiently and parsimoniously; it rests on an adaptive, oriented design-of-experiments methodology combined with surrogate-model construction. The proposed methods are compared on analytical examples and then applied to full-scale problems from numerical dosimetry.
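As a generic illustration of the 95th-percentile estimation problem this abstract describes (this is not the thesis's surrogate-model machinery; the lognormal "exposure simulator" and all numbers are invented), a minimal Monte Carlo plus percentile-bootstrap sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a dosimetry simulator: the output
# distribution is lognormal, and we want its 95th percentile.
def simulate_exposure(n):
    return rng.lognormal(mean=0.0, sigma=0.5, size=n)

sample = simulate_exposure(500)
q95 = np.quantile(sample, 0.95)          # point estimate of the 95th percentile

# Percentile bootstrap: re-estimate the quantile on resamples to
# attach an uncertainty interval to the estimate.
boot = np.array([
    np.quantile(rng.choice(sample, size=sample.size, replace=True), 0.95)
    for _ in range(1000)
])
ci_low, ci_high = np.quantile(boot, [0.025, 0.975])
print(q95, (ci_low, ci_high))
```

The adaptive design-of-experiments strategy of the thesis replaces the brute-force sampling above with far fewer, better-placed simulator calls.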
Richardot, Philippe. "Differents outils pour la discrimination entre deux groupes." Paris 9, 1985. https://portail.bu.dauphine.fr/fileviewer/index.php?doc=1985PA090009.
Ben Ishak, Anis. "Sélection de variables par les machines à vecteurs supports pour la discrimination binaire et multiclasse en grande dimension." Aix-Marseille 2, 2007. http://www.theses.fr/2007AIX22067.
Flachaire, Emmanuel. "Les méthodes du bootstrap et l'inférence robuste à l'hétéroscédasticité." Aix-Marseille 2, 1998. http://www.theses.fr/1998AIX24010.
Ciolek, Gabriela. "Bootstrap and uniform bounds for Harris Markov chains." Electronic Thesis or Diss., Université Paris-Saclay (ComUE), 2018. http://www.theses.fr/2018SACLT024.
This thesis concentrates on extensions of empirical process theory to Markovian data. More specifically, we focus on developments in bootstrap, robustness and statistical learning theory in a Harris recurrent framework. Our approach relies on regenerative methods, which divide the sample path of the regenerative Markov chain under study into independent and identically distributed (i.i.d.) blocks of observations. These regeneration blocks correspond to path segments between random visit times to a well-chosen set (the atom) forming a renewal sequence. In the first part of the thesis we derive uniform bootstrap central limit theorems for Harris recurrent Markov chains over uniformly bounded classes of functions, and show that the result also generalizes to the unbounded case. We use these results to obtain uniform bootstrap central limit theorems for Fréchet-differentiable functionals of Harris Markov chains. Motivated by a wide range of applications, we discuss how to extend some concepts of robustness from the i.i.d. framework to the Markovian setting, in particular when the data are piecewise-deterministic Markov processes. Next, we propose residual and wild bootstrap procedures for periodically autoregressive processes and show their consistency. In the second part of the thesis we establish maximal versions of Bernstein, Hoeffding and polynomial-tail concentration inequalities, expressed in terms of covering numbers and of moments of return times and blocks. Finally, we use these tail inequalities to derive generalization bounds for minimum volume set estimation for regenerative Markov chains.
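The regenerative bootstrap idea summarized above — cut the path at visits to an atom, then resample the resulting i.i.d. blocks — can be sketched on a toy finite chain (a reflected random walk, with state 0 playing the role of the atom; a minimal illustration, not the thesis's construction):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy atomic Markov chain: reflected random walk on {0,...,4}.
n = 2000
chain = np.zeros(n, dtype=int)
for t in range(1, n):
    chain[t] = min(max(chain[t - 1] + rng.choice([-1, 1]), 0), 4)

# Cut the path into regeneration blocks: segments between visits to the atom.
returns = np.flatnonzero(chain == 0)
blocks = [chain[returns[i]:returns[i + 1]] for i in range(len(returns) - 1)]

# Regenerative bootstrap: resample whole blocks i.i.d. and glue them together.
def block_bootstrap_mean(blocks, n_boot=500):
    means = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, len(blocks), size=len(blocks))
        means[b] = np.concatenate([blocks[i] for i in idx]).mean()
    return means

boot_means = block_bootstrap_mean(blocks)
print(chain.mean(), boot_means.std())
```

Resampling blocks rather than single observations preserves the within-block dependence, which is what makes the bootstrap valid for such chains.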
Liquet, Benoit. "Sélection de modèles semi-paramétriques." Bordeaux 2, 2002. http://www.theses.fr/2002BOR20958.
Full textLapenta, Elia. "Three Essays in Hypothesis Testing." Thesis, Toulouse 1, 2020. http://www.theses.fr/2020TOU10053.
Fahmi, Ahmed. "Contribution à une théorie de l'investissement immatériel : le cas de la gestion de la qualité totale." Dijon, 1999. http://www.theses.fr/1999DIJOE013.
Jouini, Jamel. "Approche économétrique des modèles avec changements structurels : quelques contributions." Aix-Marseille 2, 2004. http://www.theses.fr/2004AIX24012.
This thesis explores empirical evidence of instability by uncovering structural breaks in economic and financial time series. We first define structural change models and review the most important results available in the econometric and statistical literature, and then present the main contributions of the thesis. We begin with a detailed treatment of structural breaks in time series models based on parametric and nonparametric approaches. While the parametric approach addresses the instability problem in the time domain by uncovering structural breaks in time series, the nonparametric approach, based on Priestley's evolutionary spectra generalizing the usual definition of spectra for stationary processes, characterizes instability simultaneously in the time and frequency domains by locating the unstable frequencies and the associated dates. We then propose the use of the bootstrap in connection with testing for structural breaks, motivated by the fact that the asymptotic distribution theory of many of the break tests presented in the literature may not be particularly useful in small-sample situations; the bootstrap turns out to be a very useful tool for surmounting these problems efficiently. The thesis finally takes up a problem that has received plenty of attention in the recent time series literature: the relationship between structural breaks and long memory. The goal is to model the series with a process that accounts for both features at once, in order to investigate the effect of one on the other.
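A minimal sketch of bootstrapping a structural-break test in the spirit described above, but much simplified: i.i.d. resampling of the centered series imposes the no-break null, and a sup-type statistic over candidate break dates is recomputed on each resample (the statistic and all numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

def sup_stat(y):
    # Sup over candidate break dates of a two-sample t-like statistic.
    n = len(y)
    stats = []
    for k in range(10, n - 10):
        a, b = y[:k], y[k:]
        stats.append(abs(a.mean() - b.mean())
                     / np.sqrt(a.var() / len(a) + b.var() / len(b)))
    return max(stats)

# A series with a mean shift halfway through.
y = np.concatenate([rng.normal(0.0, 1, 100), rng.normal(1.0, 1, 100)])
obs = sup_stat(y)

# Bootstrap null distribution: resample from the centered series,
# which removes the break while keeping the marginal distribution.
centered = y - y.mean()
null = np.array([sup_stat(rng.choice(centered, size=len(y), replace=True))
                 for _ in range(200)])
p_value = np.mean(null >= obs)
print(obs, p_value)
```

For dependent data, block or wild resampling would replace the i.i.d. resampling used here.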
Diop, Aba. "Inférence statistique dans le modèle de régression logistique avec fraction immune." PhD thesis, Université de La Rochelle, 2012. http://tel.archives-ouvertes.fr/tel-00829844.
Carcreff, Joëlle. "La qualité de l'information comptable rapportée par les banques de données : étude du cas français." Lille 2, 2001. http://www.theses.fr/2001LIL20014.
A current of literature on the quality of firm information provided by well-known databases in the U.S.A. shows that many financial or accounting variables are biased depending on the file used. Within this framework, our objective is to see whether the accounting information included in three databases available in France is also affected by this so-called "database" bias, and what the consequences of this bias are for the empirical results obtained in accounting research applied to the French case. First, we show that many items of the financial statements differ significantly between the three files involved. Three major sources of discrepancy are detected, but the phenomenon of errors is not significant.
Juan, Sandrine. "Les modélisations économétriques d'estimation de coût dans l'industrie automobile : l'apport des techniques de bootstrap." Dijon, 1999. http://www.theses.fr/1999DIJOE023.
Maiz, Sofiane. "Estimation et détection de signaux cyclostationnaires par les méthodes de ré-échantillonnage statistique : applications à l'analyse des signaux biomécaniques." Thesis, Saint-Etienne, 2014. http://www.theses.fr/2014STET4020/document.
In mechanical and biomechanical signal analysis, decision-support tools rest on strong statistical assumptions such as normality, stationarity of variables, and independence. These assumptions are very often unverified, and wrong decisions may be taken as a result. This work proposes new methods that dispense with such assumptions, including the stationarity and Gaussianity of variables. In this thesis, we revisit some statistical resampling methods and develop new bootstrap approaches that account for the cyclostationary nature of the signals. We then apply these methods to the analysis of biomechanical signals from experienced runners and from a population of elderly people. The results demonstrate significant changes in the second-order frequency content of the signals under study. These changes were highly relevant indicators for describing and characterizing the fatigue of a high-level professional runner, and they helped us to understand the mechanics of normal walking and of walking under a cognitive load (dual-task walking) in the elderly.
Thai, Hoai Thu. "Développement de modèles mécanistiques et évaluation de l'incertitude des paramètres par bootstrap : application aux médicaments anti-angiogéniques." Paris 7, 2013. http://www.theses.fr/2013PA077025.
Angiogenesis, the development of new blood vessels from pre-existing vasculature, is mediated in particular by vascular endothelial growth factor (VEGF), a therapeutic target of new anti-angiogenic drugs such as aflibercept (Zaltrap®). Because of its binding to VEGF, the pharmacokinetic (PK)/pharmacodynamic (PD) properties of this new drug are more complex. In this thesis, we studied the mechanism of action of aflibercept by building population PK/PD models. We first developed a joint PK model of free and bound aflibercept in healthy subjects. We then applied this model to data from cancer patients, assessed the influence of physiopathological factors on their PK, and evaluated the choice of therapeutic dose by simulation. A PD model characterizing the effect of aflibercept on tumor growth was then built for patients with metastatic colorectal cancer. We also studied by simulation the contribution of the bootstrap approach to estimating parameter uncertainty in nonlinear mixed-effects models (NLMEM). We showed that the bootstrap provides better uncertainty estimates than the asymptotic method only in NLMEM with high nonlinearity. The case bootstrap performs as well as the nonparametric bootstrap of both random effects and residuals; however, both may face practical problems, e.g. skewed distributions of parameter estimates and unbalanced designs where stratification may be insufficient.
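The case bootstrap versus residual bootstrap comparison mentioned above can be illustrated on a toy one-parameter nonlinear model rather than the thesis's NLMEM (the exponential model, the grid-search fit and all numbers are invented):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy PK-like model y = exp(-k t) + noise, with unknown rate k.
t = np.linspace(0.1, 5.0, 40)
y = np.exp(-0.7 * t) + rng.normal(0, 0.05, t.size)

def fit_k(t, y, grid=np.linspace(0.1, 2.0, 200)):
    # Crude least-squares fit by grid search (enough for a sketch).
    sse = [np.square(y - np.exp(-k * t)).sum() for k in grid]
    return grid[int(np.argmin(sse))]

k_hat = fit_k(t, y)
resid = y - np.exp(-k_hat * t)

def case_bootstrap(n_boot=200):
    # Resample whole (t, y) cases with replacement, refit each time.
    ks = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, t.size, size=t.size)
        ks[b] = fit_k(t[idx], y[idx])
    return ks

def residual_bootstrap(n_boot=200):
    # Rebuild pseudo-data from the fitted curve plus resampled residuals.
    ks = np.empty(n_boot)
    for b in range(n_boot):
        y_star = np.exp(-k_hat * t) + rng.choice(resid, size=t.size, replace=True)
        ks[b] = fit_k(t, y_star)
    return ks

se_case, se_resid = case_bootstrap().std(), residual_bootstrap().std()
print(k_hat, se_case, se_resid)
```

In mixed-effects settings the "cases" resampled are whole subjects, and the residual scheme additionally resamples estimated random effects.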
Cavaignac, Laurent. "Les Fonctions distance en théorie de la production : applications en incertitude et aux tests non paramétriques." Perpignan, 2007. http://www.theses.fr/2007PERP0779.
This research focuses on production theory and is based on the fundamental assumption that production units are potentially inefficient. It deals with distance functions, widely used efficiency-measurement tools in production microeconomics. The thesis is threefold. Its first part considers a world where the output level is uncertain for the producer even though the input quantities are known. In this context, we define a model with continuous random output variables and derive different decompositions of allocative efficiency measures in this output-contingent framework. The second part analyses the properties of technical efficiency measures: we list the properties required of directional efficiency measures, examine whether the usual directional measures or some new ones exhibit them, and conclude by defining a measure that generalizes most of the known radial efficiency measures. The last chapters focus on nonparametric tests of production technologies. We use and improve the Primont and Primont (1994) test to show how homotheticity and translation-homotheticity tests can be performed using the Simar and Wilson (1999) bootstrap technique. We finally describe a new input congestion test and illustrate it with an application to the air transport sector.
Saiu, Luca. "GNU epsilon : an extensible programming language." Paris 13, 2012. http://scbd-sto.univ-paris13.fr/intranet/edgalilee_th_2012_saiu.pdf.
Reductionism is a viable strategy for designing and implementing practical programming languages, leading to solutions which are easier to extend, experiment with and formally analyze. We formally specify and implement an extensible programming language, based on a minimalistic first-order imperative core plus strong abstraction mechanisms, reflection and self-modification features. The language can be extended to very high levels: using Lisp-style macros and code-to-code transforms which automatically rewrite high-level expressions into core forms, we define closures and first-class continuations on top of the core. Non-self-modifying programs can be analyzed and formally reasoned about thanks to the language's simple semantics. We formally develop a static analysis and prove a soundness property with respect to the dynamic semantics. We also develop a parallel garbage collector suited to multi-core machines, permitting efficient execution of parallel programs.
Walschaerts, Marie. "La santé reproductive de l'homme : méthodologie et statistique." Toulouse 3, 2011. http://thesesups.ups-tlse.fr/1470/.
Male reproductive health is an indicator of a man's overall health, and it is closely linked to environmental exposures and living habits. Surveillance of male fertility now shows a secular decline in sperm quality and an increase in diseases and malformations of the male reproductive tract. The objective of this work is to study male reproductive health from an epidemiological perspective and through various statistical tools. We first examine testicular cancer, its incidence and its risk factors. We then study the population of men consulting for male infertility, their andrological examinations, their therapeutic care and their parenthood project. Finally, the birth event is analyzed through survival models: the Cox model and survival trees. We compare different methods of stable variable selection (the bootstrapped stepwise and bootstrapped L1-penalisation methods based on the Cox model, the bootstrap node-level stabilization method, and random survival forests) in order to obtain a final model that is easy to interpret and improves prediction. In the south of France, the incidence of testicular cancer has doubled over the past 20 years. The birth cohort effect, i.e. the generational effect, suggests a deleterious effect of environmental exposure on male reproductive health. However, a man's living environment during adult life does not seem to be a potential risk factor for testicular cancer, pointing instead to exposure to endocrine disruptors in utero. Male factors account for 50% of cases of infertility, making the management of male infertility essential. In our cohort, 85% of male partners presented an abnormal clinical examination (either a medical history or an anomaly found on andrological examination). Finally, one in two couples who consulted for male infertility successfully had a child.
Paternal age over 35 appears to be a major risk factor, which should encourage couples to start their parenthood project earlier. When taking survival time into account in the reproductive outcome of these infertile couples, including large numbers of covariates often yields unstable models. We therefore combined the bootstrap with variable selection approaches. Although random survival forests perform best in prediction, their results are not easily interpretable, and results differ with sample size. Within the Cox model, the stepwise algorithm is inappropriate when the number of events is too small, and the bootstrap node-level stabilization method does not seem to predict better than a simple survival tree (the tree is difficult to prune). Finally, the Cox model with variables selected by the L1-penalisation method seems a good compromise between interpretation and prediction.
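A minimal sketch of bootstrap-based stable variable selection in the spirit described above, on a generic linear model with a crude t-statistic selector standing in for the Cox-model methods of the thesis (all data are synthetic):

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic linear data: only the first two of six covariates matter.
n, p = 120, 6
X = rng.normal(size=(n, p))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)

def selected(X, y, threshold=2.0):
    # Crude selector: keep covariates whose OLS t-statistic exceeds a threshold.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    sigma2 = np.square(y - X @ beta).sum() / (len(y) - X.shape[1])
    se = np.sqrt(sigma2 * np.diag(np.linalg.inv(X.T @ X)))
    return np.abs(beta / se) > threshold

# Bootstrap inclusion frequencies: rerun the selector on resampled data
# and keep, at the end, only variables selected in most resamples.
freq = np.zeros(p)
n_boot = 300
for _ in range(n_boot):
    idx = rng.integers(0, n, size=n)
    freq += selected(X[idx], y[idx])
freq /= n_boot
print(freq)
```

The stability idea is the same whatever the base selector: informative variables are chosen in nearly every resample, while noise variables appear only sporadically.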
Abellan, Alexandre. "Construction de fonctions objectifs et modélisation a priori de l’apport de mesures en géosciences : une approche statistique." Strasbourg, 2009. https://publication-theses.unistra.fr/public/theses_doctorat/2009/ABELLAN_Alexandre_2009.pdf.
Due to the increasing costs of investments in the oil industry, reservoir characterization and uncertainty quantification are now major issues in oil and gas reservoir engineering. As it is impossible to obtain exhaustive knowledge of the reservoir, a probabilistic description is adopted, relying on a smaller number of parameters that must be estimated from the data. In a second step, the model is history-matched in order to be consistent with the observed dynamic data (e.g. pressure at the well, fluid flow rates), by minimizing an objective function that quantifies the discrepancy between simulated and real data. Geological data are integrated when building the prior model, and the Bayesian framework allows these tasks to be performed in a natural way. Next, using entropic considerations, we introduce the Kullback-Leibler divergence, which quantifies the "information content" of any potential measurement relative to the considered model. A tractable algebraic expression is worked out for Gaussian models with data depending linearly on the model. Explicit tests were carried out for kriging, well-test interpretation and a two-phase flow context. The notion of data redundancy was also explored and quantified. The developed technique could help design optimal data acquisition schemes.
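The Gaussian-linear case mentioned above admits a closed-form "information content": the Kullback-Leibler divergence between posterior and prior. A minimal sketch (the observation operator G, the noise covariance and the datum are made-up numbers, not values from the thesis):

```python
import numpy as np

# Linear-Gaussian model: m ~ N(mu0, S0), data d = G m + noise, noise ~ N(0, Sd).
mu0 = np.zeros(2)
S0 = np.eye(2)
G = np.array([[1.0, 0.5]])        # one hypothetical measurement
Sd = np.array([[0.1]])
d = np.array([1.0])

# Posterior by the usual conjugate (Kalman-type) update.
K = S0 @ G.T @ np.linalg.inv(G @ S0 @ G.T + Sd)
mu1 = mu0 + (K @ (d - G @ mu0)).ravel()
S1 = (np.eye(2) - K @ G) @ S0

def kl_gauss(ma, Sa, mb, Sb):
    # KL( N(ma, Sa) || N(mb, Sb) ) in closed form.
    k = len(ma)
    Sbi = np.linalg.inv(Sb)
    dm = mb - ma
    return 0.5 * (np.trace(Sbi @ Sa) + dm @ Sbi @ dm - k
                  + np.log(np.linalg.det(Sb) / np.linalg.det(Sa)))

# "Information gain" of the measurement: divergence of posterior from prior.
info = kl_gauss(mu1, S1, mu0, S0)
print(info)
```

Comparing this quantity across candidate measurements is what allows acquisition schemes to be ranked before any data are collected.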
Berry, Vincent. "Méthodes et algorithmes pour reconstruire les arbres de l'Evolution." Montpellier 2, 1997. http://www.theses.fr/1997MON20246.
Makany, Roger Armand. "Techniques de validation en statistiques : application à l'analyse en composantes principales et à la régression." Paris 11, 1985. http://www.theses.fr/1985PA112222.
In this work, the validation of results is studied through the stability of those results. The study rests on the stability of latent roots and subspaces in principal component analysis; as regards regression, the author focuses on the stability of the regression coefficients. Beyond the stability criteria put forward, the software designed for the data processing is also presented.
Marhaba, Bassel. "Restauration d'images Satellitaires par des techniques de filtrage statistique non linéaire." Thesis, Littoral, 2018. http://www.theses.fr/2018DUNK0502/document.
Satellite image processing is considered one of the most interesting areas of digital image processing. Satellite images can be degraded for several reasons: satellite movement, weather, scattering, and other factors. Several methods for satellite image enhancement and restoration have been studied and developed in the literature. The work presented in this thesis focuses on satellite image restoration by nonlinear statistical filtering techniques. As a first step, we propose a novel method to restore satellite images using a combination of blind and non-blind restoration techniques, so as to exploit the advantages of each. As a second step, we propose novel statistical image restoration algorithms based on nonlinear filters and nonparametric multivariate density estimation. The nonparametric multivariate estimate of the posterior density is used in the resampling step of the Bayesian bootstrap filter to resolve the problem of loss of diversity among the particles. Finally, we introduce a new hybrid combination method for image restoration based on the discrete wavelet transform (DWT) and the algorithms proposed in the second step, and we show that the combined method outperforms the DWT approach in reducing noise in degraded satellite images.
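The resampling step of a bootstrap (particle) filter, mentioned above as the place where particle diversity is lost, can be sketched with the standard systematic resampling scheme (a generic illustration, not the thesis's density-estimation-based resampler; the particle cloud is invented):

```python
import numpy as np

rng = np.random.default_rng(7)

def systematic_resample(weights):
    # Systematic resampling: a low-variance alternative to plain
    # multinomial resampling, commonly used to limit diversity loss.
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    cs = np.cumsum(weights)
    cs[-1] = 1.0                     # guard against floating-point undershoot
    return np.searchsorted(cs, positions)

# Toy weighted particle cloud approximating a posterior centered at 1.
particles = np.linspace(-3, 3, 200)
weights = np.exp(-0.5 * (particles - 1.0) ** 2)
weights /= weights.sum()

resampled = particles[systematic_resample(weights)]
print(resampled.mean(), resampled.std())
```

After resampling, all particles carry equal weight; the thesis's contribution is to draw them from a smoothed (kernel) estimate of the posterior instead of the discrete cloud itself.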
Hemery, Carine. "Prospectives à long terme de la consommation énergétique en transport de marchandises." Paris 1, 2009. http://www.theses.fr/2009PA010035.
Enjolras, Geoffroy. "De l'assurabilité des catastrophes naturelles : Modélisation et application à l'assurance récolte." Montpellier 1, 2008. http://www.theses.fr/2008MON10020.
Natural catastrophes are complex hazards that generate major losses, especially in the agricultural sector. As a result, their insurability is limited and the insurance market is incomplete. Facing these limitations, we develop an innovative combination of insurance and financial instruments adapted to covering the different aspects of natural risks. Our application focuses on crop insurance, which is being reformed in France. The thesis is organized in four main parts: the first surveys the literature on catastrophic risks and potential insurance and financial instruments; the second presents a case study of flood risk management at the river-basin scale; in the third, we develop a theoretical model aiming to offer optimal coverage against catastrophic risk; the fourth provides an empirical validation of the model by performing regressions and a full set of tests.
Condomines, Bérangère. "De la compétence individuelle à la performance individuelle : mise en évidence et discussion à partir de l'appréciation des managers par la méthode du centre d'évaluation." Paris 1, 2012. http://www.theses.fr/2012PA010041.
Shapira, Assaf. "Bootstrap percolation and kinetically constrained models in homogeneous and random environments." Thesis, Sorbonne Paris Cité, 2019. http://www.theses.fr/2019USPCC066.
This thesis concerns kinetically constrained models and bootstrap percolation, two topics at the intersection of probability, combinatorics and statistical mechanics. Kinetically constrained models were introduced by physicists in the 1980s to model the liquid-glass transition, whose understanding remains one of the big open questions in condensed matter physics. They have been studied extensively in the physics literature in the hope of shedding light on this problem, and in the last decade they have also received increasing attention in the probability community. We will see that even though they belong to the well-established field of interacting particle systems with stochastic dynamics, kinetically constrained models pose challenging and interesting problems requiring the development of new mathematical tools. Bootstrap percolation, on the other hand, is a class of monotone cellular automata, i.e. dynamics that are discrete in time and deterministic, the first example being the r-neighbor bootstrap percolation introduced in 1979. Since then, the study of bootstrap percolation has been an active domain in both the combinatorial and probabilistic communities, with several breakthroughs in recent years. Though introduced in different contexts, kinetically constrained models and bootstrap percolation are intimately related: one may think of bootstrap percolation as a deterministic counterpart of kinetically constrained models, and of kinetically constrained models as the natural stochastic version of bootstrap percolation.
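The r-neighbor bootstrap percolation dynamics described above are easy to simulate; a minimal sketch on a 40x40 torus with r = 2 (the grid size and initial density are arbitrary choices, not values from the thesis):

```python
import numpy as np

rng = np.random.default_rng(8)

def bootstrap_percolation(active, r=2, steps=100):
    # r-neighbor bootstrap percolation: an inactive site activates once at
    # least r of its 4 lattice neighbors are active; activation is monotone
    # (active sites never deactivate). np.roll makes the boundary periodic.
    a = active.copy()
    for _ in range(steps):
        nbrs = (np.roll(a, 1, 0).astype(int) + np.roll(a, -1, 0)
                + np.roll(a, 1, 1) + np.roll(a, -1, 1))
        new = a | (nbrs >= r)
        if (new == a).all():          # fixed point reached
            break
        a = new
    return a

grid = rng.random((40, 40)) < 0.1     # each site initially active with prob 0.1
final = bootstrap_percolation(grid)
print(grid.mean(), final.mean())
```

Whether the final configuration is fully active depends sharply on the initial density, which is exactly the threshold phenomenon studied in this literature.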
Albertus, Mickael. "Processus empirique avec informations auxiliaires." Thesis, Toulouse 3, 2019. http://www.theses.fr/2019TOU30095.
This thesis studies the empirical process with auxiliary information, that is, information known a priori or obtained from an external source. We show how to modify the empirical process to take auxiliary information into account, and that injecting auxiliary information at the level of the empirical process improves the quality of statistical estimates as well as the power of standard statistical tests. The first chapter contains the main definitions and the key results used in the thesis. The second and third chapters study the particular case where the auxiliary information is given by the probabilities of the sets of one or several given partitions; the third chapter focuses on the Raking-Ratio method, widely used in statistics to combine knowledge of the probabilities of the sets of several partitions. In the fourth chapter, we generalize the definition of auxiliary information while retaining the possibility of establishing strong approximation results, at the cost of some loss of generality. In the last chapter, we establish the strong approximation of the empirical process under bootstrap sampling and combine the bootstrap method with Raking-Ratio.
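The Raking-Ratio method mentioned above iteratively rescales sample weights so that they match the known probabilities of the sets of each partition in turn. A minimal sketch with two partitions (the memberships and marginal probabilities are invented):

```python
import numpy as np

rng = np.random.default_rng(9)

# Sample cross-classified by two partitions (say, age group x region),
# with known true marginal probabilities for each partition.
n = 1000
row = rng.integers(0, 2, n)              # partition 1 membership
col = rng.integers(0, 3, n)              # partition 2 membership
true_row = np.array([0.4, 0.6])          # known auxiliary information
true_col = np.array([0.2, 0.3, 0.5])

w = np.full(n, 1.0 / n)                  # initial empirical weights
for _ in range(50):                      # Raking-Ratio: alternate marginal fits
    for g in range(2):
        mask = row == g
        w[mask] *= true_row[g] / w[mask].sum()
    for g in range(3):
        mask = col == g
        w[mask] *= true_col[g] / w[mask].sum()

row_fit = np.array([w[row == g].sum() for g in range(2)])
col_fit = np.array([w[col == g].sum() for g in range(3)])
print(row_fit, col_fit)
```

Estimates built with the raked weights then exploit the auxiliary marginals, which is the variance reduction the thesis quantifies at the empirical-process level.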
Sellali, Brahim. "Intégration du retour d'expérience d'exploitation dans une approche MBF pour optimiser la fiabilité de matériel." Nancy 1, 1998. http://www.theses.fr/1998NAN10307.
Cornea, Adriana. "Bootstrap raffiné pour les lois stables parétiennes avec applications aux rendements des actifs financiers." Aix-Marseille 2, 2008. http://www.theses.fr/2008AIX24023.
The supremacy of stable Paretian distributions over Gaussian distributions is by now a stylized fact in financial theory and practice. It is well known that asymptotic inference about expected returns is not always reliable, and the nonparametric bootstrap can be used as an alternative. However, several studies have emphasized that the nonparametric bootstrap is invalid for stable Paretian distributions: expected returns are highly influenced by the risk of investment opportunities, a risk which is always greater in a stable Paretian financial market than in a Gaussian one. The widely accepted remedy for this failure is the m out of n bootstrap. In this thesis, however, it is shown that the m out of n bootstrap can also fail to provide valid inference in small samples, and can perform worse than the nonparametric bootstrap. In addition, this dissertation introduces a refined bootstrap method that overcomes the drawbacks of both the nonparametric bootstrap and the m out of n bootstrap. The behavior of the refined bootstrap is investigated through a simulation study, and its performance is illustrated using returns of hedge funds and jumps of the S&P 500 futures index.
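The m out of n bootstrap discussed above replaces full-size resamples by resamples of size m much smaller than n. A classic toy case where the naive bootstrap is inconsistent, the sample maximum, illustrates the difference (a generic sketch, not the thesis's refined method):

```python
import numpy as np

rng = np.random.default_rng(10)

n = 1000
x = rng.uniform(0, 1, n)
theta_hat = x.max()                      # estimator whose bootstrap is tricky

def boot_max(sample_size, n_boot=2000):
    return np.array([rng.choice(x, size=sample_size, replace=True).max()
                     for _ in range(n_boot)])

naive = boot_max(n)                      # classical n out of n bootstrap
m = int(np.sqrt(n))                      # a common choice: m -> inf, m/n -> 0
moon = boot_max(m)                       # m out of n bootstrap

# The naive bootstrap puts mass about 1 - 1/e on the sample maximum itself,
# a well-known symptom of its inconsistency for extremes; the m out of n
# version spreads its mass over the upper order statistics instead.
print((naive == theta_hat).mean(), (moon == theta_hat).mean())
```

The thesis's point is that the choice of m is delicate in small samples, which motivates its refined alternative.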
Rosa Vargas, José Ismael de la. "Estimation de la densité de probabilité d'une mesure dans un cadre non-linéaire, non-gaussien." Paris 11, 2002. http://www.theses.fr/2002PA112201.
Full textThe characterization and modeling of an indirect measurement procedure is led by a set of previously observed data. The modeling task is it self a complex procedure which is correlated with the measurement objective. Far from model building and model selection, a theoretical and practical problem persists: What is the correct probability density function (PDF) of a parametric model? Once this PDF is approximated, the next step is to establish a mechanism to propagate this statistical information until the quantity of interest. In fact, such a quantity is a measurement estimate and it is a nonlinear function of the parametric model. The present work proposes some different methods to make statistical inferences about the measurement estimate. We propose a first approach based on bootstrap methods. Such methods are classical in statistical simulation together with Monte Carlo methods, and they require a significative time of calcul. However, the precision over the measurement PDF estimated by these methods is very good. On the other hand, we have verified that the bootstrap methods convergence is faster than the Primitive Monte Carlo's one. Another advantage of bootstrap is its capacity to determine the statistical nature of errors which perturb the measurement system. This is doing thanks to the empirical estimation of the errors PDF. The bootstrap convergence optimization could be achieved by smoothing the residuals or by using a modified iterated bootstrap scheme. More over, we propose to use robust estimation when outliers are present. The second approach is based on other sampling techniques called Markov Chain Monte Carlo (MCMC), the statistical inference obtained when using these methods is very interesting, since we can use all a priori information about the measurement system. We can reformulate the problem solution by using the Bayes rule. The Gibbs sampling and the Metropolis-Hastings algorithms were exploited in this work. 
We address the MCMC convergence optimization problem by using weighted resampling and coupling from the past (CFTP) schemes, and we adapt such techniques to the measurement PDF approximation. The last proposed approach is based on kernel methods. The main idea is founded on the nonparametric estimation of the errors' PDF, which is assumed unknown. We then optimize a criterion function based on the entropy of the errors' PDF, obtaining a minimum entropy estimator (MEE). Simulating this estimation process by means of Monte Carlo, MCMC, or weighted bootstrap leads us to construct a statistical approximation of the measurement population.
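The bootstrap propagation of parameter uncertainty to a nonlinear measurement quantity, as described in this abstract, can be sketched in a few lines. The toy model y_i = θ + e_i, the derived quantity g(θ) = θ², and all numerical values below are illustrative assumptions, not the thesis's measurement system:

```python
import random

random.seed(0)
theta_true = 2.0
# Toy observations of the measurement system: y_i = theta + e_i.
data = [theta_true + random.gauss(0.0, 0.5) for _ in range(200)]

def estimate(sample):
    """Fit the toy model (sample mean) and propagate to g(theta) = theta**2."""
    theta_hat = sum(sample) / len(sample)
    return theta_hat ** 2  # nonlinear measurement quantity

B = 1000
# Resample the data with replacement and re-estimate B times.
boot = sorted(estimate(random.choices(data, k=len(data))) for _ in range(B))

# Percentile 95% interval for the measurement estimate.
lo, hi = boot[int(0.025 * B)], boot[int(0.975 * B)]
print(lo, hi)
```

The sorted bootstrap replicates approximate the PDF of the measurement estimate; any functional of that PDF (interval, mode, spread) can then be read off directly.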
Andrieu, Guillaume. "Estimation par intervalle d'une distance évolutive." Montpellier 2, 1997. http://www.theses.fr/1997MON20233.
Full textSidi, Zakari Ibrahim. "Sélection de variables et régression sur les quantiles." Thesis, Lille 1, 2013. http://www.theses.fr/2013LIL10081/document.
Full textThis work is a contribution to the selection of statistical models, and more specifically to variable selection in penalized linear quantile regression when the dimension is high. It focuses on two points in the selection process: the stability of selection and the inclusion of variables through the grouping effect. As a first contribution, we propose a transition from penalized least squares regression to quantile regression (QR). A bootstrap approach based on the selection frequency of each variable is proposed for the construction of linear models (LM). In most cases, the QR approach provides more significant coefficients. A second contribution is to adapt some algorithms of the "random" LASSO (Least Absolute Shrinkage and Selection Operator) family to QR and to propose methods for selection stability. Examples from food security illustrate the obtained results. In the setting of penalized QR in high dimension, the grouping effect property is established under weak conditions, together with the oracle property. Two examples on real and simulated data illustrate the regularization paths of the proposed algorithms. The last contribution deals with variable selection for generalized linear models (GLM) using the nonconcave penalized likelihood. We propose an algorithm to maximize the penalized likelihood for a broad class of non-convex penalty functions. The convergence property of the algorithm and the oracle property of the estimator obtained after one iteration are established. Simulations and an application to real data are also presented
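The bootstrap selection-frequency idea mentioned in this abstract can be illustrated schematically. A simple univariate screening rule stands in here for the penalized quantile regression of the thesis; the simulated data, the correlation threshold, and the 80% stability cut-off are all assumptions for illustration:

```python
import random

random.seed(1)
n, p = 120, 5
# Only variables 0 and 1 truly drive y.
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
y = [2 * row[0] - 1.5 * row[1] + random.gauss(0, 1) for row in X]

def corr(a, b):
    """Pearson correlation of two equal-length lists."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    da = sum((u - ma) ** 2 for u in a) ** 0.5
    db = sum((v - mb) ** 2 for v in b) ** 0.5
    return num / (da * db)

def selected(idx):
    """Screening rule on one bootstrap sample: keep variables with |corr| > 0.3."""
    Xb, yb = [X[i] for i in idx], [y[i] for i in idx]
    return {j for j in range(p)
            if abs(corr([r[j] for r in Xb], yb)) > 0.3}

B = 200
freq = [0] * p
for _ in range(B):
    idx = [random.randrange(n) for _ in range(n)]  # bootstrap indices
    for j in selected(idx):
        freq[j] += 1

# Keep variables selected in at least 80% of bootstrap replicates.
stable = [j for j in range(p) if freq[j] / B >= 0.8]
print(stable)
```

The point is the outer loop: whatever selector is plugged in (here screening, in the thesis a penalized QR), its selection frequency over bootstrap replicates yields a stable final model.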
Lerasle, Matthieu. "Rééchantillonnage et sélection de modèles optimale pour l'estimation de la densité." Toulouse, INSA, 2009. http://eprint.insa-toulouse.fr/archive/00000290/.
Full textBarros, Emanoel de Souza. "Estimations de frontières de production des exploitations agricoles par des approches paramétriques, non paramétriques et bootstrap : avec une application à l'évaluation des effets de l'irrigation dans le Nordeste du Brésil." Paris 1, 2006. http://www.theses.fr/2006PA010024.
Full textBanga, Mbom Calvin David. "L'approche bootstrap en analyse des images : application à la restitution de la cinétique de la fuite dans la choriorétinopathie séreuse centrale." Rennes 1, 1995. http://www.theses.fr/1995REN10016.
Full textThis work concerns medical image analysis using statistical pattern recognition methods. These methods usually lead to classification and multivariate segmentation approaches that are increasingly used in medical imaging for qualitative or quantitative analysis of tissue. However, statistical pattern recognition methods are well known to require high computing time, which makes them unsuitable for image sequence analysis. To get rid of this important problem, we suggest a bootstrap approach for two-dimensional image analysis. This model is based on the random sampling of blocks of observations in the original image: correlation relationships are maintained within a given block, while the selected blocks are independent of each other. Given an original image, we randomly select a small representative set of observations for estimating the image's statistical parameters. In this way, the bootstrap model improves the estimation and reduces the computing time due to the complexity of the estimation algorithms used in statistical pattern recognition. The quantification of the serum leak kinetics in central serous chorioretinopathy is an important application for which the bootstrap model we propose is of great use. The serum leak surface is measured from a sequence of eye-fundus images after an unsupervised segmentation step using the proposed bootstrap random sampling model. Both a high-quality segmentation and a great reduction of estimation time are required for this application. A graph showing the serum leak surface versus the fluorescein injection time is then plotted to support decision-making in ophthalmology. The bootstrap model we propose for image analysis shows the way to use the bootstrap approach for statistical parameter estimation in pattern recognition, notably in image analysis by invariance methods, in texture analysis, and in multivariate segmentation
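The block-resampling idea described above (blocks independent of each other, correlation preserved within each block) can be sketched on a toy image; the image content, block size, and the statistic being estimated are illustrative assumptions:

```python
import random

random.seed(2)
H = W = 64
# Toy "image": smooth gradient plus noise, so neighboring pixels are similar.
image = [[i + j + random.gauss(0, 3) for j in range(W)] for i in range(H)]

def sample_block(size):
    """Draw one size x size block uniformly at random from the image."""
    r = random.randrange(H - size + 1)
    c = random.randrange(W - size + 1)
    return [row[c:c + size] for row in image[r:r + size]]

def block_bootstrap_mean(n_blocks, size):
    """Estimate the image mean from independently drawn blocks only."""
    vals = [v for _ in range(n_blocks)
            for row in sample_block(size) for v in row]
    return sum(vals) / len(vals)

est = block_bootstrap_mean(n_blocks=200, size=8)
full = sum(v for row in image for v in row) / (H * W)
print(est, full)  # the block estimate approximates the full-image mean
```

Only a fraction of the pixels is visited, which is the source of the computing-time reduction the abstract claims, while within-block spatial correlation is left intact.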
Berthaume, Élodie. "L' annonce majeure d'une fusion par deux acteurs majeurs d'un secteur : la réaction des cours de bourse des outsiders." Paris 1, 2011. http://www.theses.fr/2011PA010051.
Full textLenormand, Anne. "Prévisions dans les modèles cointégrés avec rupture : application à la demande de transports terrestres de marchandises et de voyageurs." Paris 1, 2002. http://www.theses.fr/2002PA010007.
Full textEssid, Hédi. "L'induction statistique dans le modèle DEA avec inputs quasi-fixes : développements théoriques et une application au secteur de l'éducation tunisien." Lille 1, 2007. https://pepite-depot.univ-lille.fr/LIBRE/Th_Num/2007/50374-2007-Essid.pdf.
Full textAlbert, Mélisande. "Tests d’indépendance par bootstrap et permutation : étude asymptotique et non-asymptotique. Application en neurosciences." Thesis, Nice, 2015. http://www.theses.fr/2015NICE4079/document.
Full textOn the one hand, we construct such tests based on bootstrap and permutation approaches. Their asymptotic performances are studied in a point process framework through the analysis of the asymptotic behavior of the conditional distributions of both bootstrapped and permuted test statistics, under the null hypothesis as well as under any alternative. A simulation study verifies the usability of these tests in practice and compares them to existing classical methods in neuroscience. We then focus on permutation tests, well known for their non-asymptotic level properties. Their p-values, based on the delayed coincidence count, are implemented in a multiple testing procedure, called the Permutation Unitary Events method, to detect synchronization occurrences between two neurons. The practical validity of the method is verified in a simulation study before being applied to real data. On the other hand, the non-asymptotic performances of the permutation tests are studied in terms of uniform separation rates. A new aggregated procedure based on a wavelet thresholding method is developed in the density framework. Based on Talagrand's fundamental inequalities, we provide a new Bernstein-type concentration inequality for randomly permuted sums. In particular, it allows us to upper bound the uniform separation rate of the aggregated procedure over weak Besov spaces and to deduce that this procedure seems to be optimal and adaptive in the minimax sense
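A generic permutation test of independence with the +1-corrected p-value (the correction that guarantees the non-asymptotic level mentioned above) can be sketched as follows; a simple covariance statistic stands in here for the delayed coincidence count used in the thesis:

```python
import random

random.seed(3)
n = 100
x = [random.gauss(0, 1) for _ in range(n)]
y = [xi + random.gauss(0, 1) for xi in x]    # y depends on x

def stat(a, b):
    """Absolute sample covariance, used here as the test statistic."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return abs(sum((u - ma) * (v - mb) for u, v in zip(a, b)) / len(a))

obs = stat(x, y)
B = 999
count = 0
for _ in range(B):
    yp = y[:]
    random.shuffle(yp)           # permuting one sample breaks the dependence
    if stat(x, yp) >= obs:
        count += 1

# Permutation p-value with the +1 correction: level is exact, not asymptotic.
p_value = (count + 1) / (B + 1)
print(p_value)
```

Under independence the permuted statistics have the same conditional distribution as the observed one, which is exactly why the resulting test controls its level for any sample size.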
Wendt, Herwig. "Contributions of Wavelet Leaders and Bootstrap to Multifractal Analysis : Images, estimation performance, dependence structure and vanishing moments. Confidence Intervals and Hypothesis Tests." Lyon, École normale supérieure (sciences), 2008. http://www.theses.fr/2008ENSL0474.
Full textThis thesis studies the benefits of using wavelet leaders and bootstrap methods for multifractal analysis. The statistical properties of wavelet-leader-based multifractal analysis procedures are characterized, and their extension to images is validated. Certain theoretical questions of crucial practical importance are investigated: minimum regularity, function space embedding, and the linearization effect. The proposed bootstrap procedures permit the construction of confidence intervals and hypothesis tests from one single finite-length observation of data. This is achieved by an original time-scale block bootstrap approach in the wavelet domain. The study of the dependence structure of wavelet coefficients of multiplicative cascades shows that increasing the number of vanishing moments of the analyzing wavelet is ineffective for reducing the long-range dependence structure. The multifractal analysis procedures are applied to hydrodynamic turbulence data and to texture image classification
Amegble, Koami Dzigbodi. "Tests non paramétriques de spécification pour densité conditionnelle : application à des modèles de choix discret." Master's thesis, Université Laval, 2015. http://hdl.handle.net/20.500.11794/25773.
Full textIn this work, we study the finite-sample statistical performance (size and power) of two nonparametric specification tests for conditional densities proposed by Fan et al. (2006) and Li and Racine (2013). These tests check whether the conditional probabilities postulated in discrete choice models (multinomial logit/probit with fixed or random effects, the Klein and Spady (1993) estimator, etc.) correctly represent the observed choices. Compared to existing tests, this approach has the advantage of offering a flexible functional form as an alternative to the parametric model when the latter turns out to be misspecified. This alternative model stems directly from the testing procedure and corresponds to the unconstrained model obtained from products of continuous and discrete kernels. The two tests explored have higher finite-sample power than existing tests. This increased performance is obtained by combining a bootstrap procedure with kernel smoothing parameters selected by least squares cross-validation. In our application, we parallelize the size and power computations, as well as the estimation of the smoothing bandwidths, on a multi-processor server (Colosse, from Calcul Québec), using "Open MPI" routines pre-implemented in R. Compared to the simulations carried out in the original articles, we postulate models closer to those usually used in applied research (in particular, logit and probit with unit variance). The simulation results confirm the good finite-sample size and power of the tests. However, the additional power gains of the smoothed statistic proposed by Li and Racine (2013) turn out to be negligible in our simulations. Keywords: bootstrap, discrete choice, conditional density, Monte Carlo, product kernels, power, size.
Depoorter, Nicolas. "Développement de méthodes de simulation pour l'analyse du comportement aux dommages de résines époxydes." Valenciennes, 2005. http://ged.univ-valenciennes.fr/nuxeo/site/esupversions/3e942ed0-5c1f-41f5-bd33-d3f1daea5985.
Full textThe demand for high-end products is continuously rising. Therefore, it is more important than ever to be able to predict deterioration under service conditions. The focus of this work lies on a toughened epoxy resin used for electronic packaging, with the aim of investigating the lifetime of this material with the help of continuum damage mechanics. For this purpose, strain-controlled fatigue experiments at several temperatures were performed, and a method to extract damage from the stress-strain hysteresis curve is suggested. Tests were performed on annealed samples in order to minimize thermally activated ageing effects. The experimental lifetimes were analyzed via Weibull analysis, and in order to determine the uncertainty, the bootstrap method was applied. Thanks to this powerful tool, it is possible to calculate a 90% confidence interval, which gives more information on the uncertainty of the lifetime. Further, the fatigue experiments were simulated with a linear viscoelastic damage model. Three damage models are suggested, based on a viscoplastic damage model proposed by Lemaitre. In order to identify the model parameters by reverse engineering, an innovative method has been used in this work. Indeed, design of experiments (DoE) provides a systematic study of the influence of the model parameters on the damage variable. Hence, it was possible to determine the best model to fit the experiments
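The bootstrap confidence-interval step can be sketched generically; a nonparametric 90% interval for the median lifetime stands in here for the thesis's full Weibull analysis, and the simulated lifetimes are assumptions:

```python
import random

random.seed(4)
# Toy fatigue lifetimes: Weibull with shape 2 and scale 1000 cycles.
lifetimes = [random.weibullvariate(1000, 2) for _ in range(60)]

def median(sample):
    s = sorted(sample)
    m = len(s) // 2
    return s[m] if len(s) % 2 else (s[m - 1] + s[m]) / 2

B = 2000
boot = sorted(median(random.choices(lifetimes, k=len(lifetimes)))
              for _ in range(B))

# 90% confidence interval: 5th and 95th percentiles of the bootstrap
# distribution of the median lifetime.
lo, hi = boot[int(0.05 * B)], boot[int(0.95 * B)]
print(lo, hi)
```

In the thesis the resampled statistic would be a fitted Weibull quantity rather than the raw median, but the resample-refit-take-percentiles loop is the same.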
Ferfache, Anouar Abdeldjaoued. "Les M-estimateurs semiparamétriques et leurs applications pour les problèmes de ruptures." Thesis, Compiègne, 2021. http://www.theses.fr/2021COMP2643.
Full textIn this dissertation we are concerned with semiparametric models. These models have success and impact in mathematical statistics due to their excellent scientific utility and intriguing theoretical complexity. In the first part of the thesis, we consider the problem of the estimation of a parameter θ, in Banach spaces, maximizing some criterion function which depends on an unknown nuisance parameter h, possibly infinite-dimensional. We show that the m out of n bootstrap, in a general setting, is weakly consistent under conditions similar to those required for weak convergence of the non smooth M-estimators. In this framework, delicate mathematical derivations will be required to cope with estimators of the nuisance parameters inside non-smooth criterion functions. We then investigate an exchangeable weighted bootstrap for function-valued estimators defined as a zero point of a function-valued random criterion function. The main ingredient is the use of a differential identity that applies when the random criterion function is linear in terms of the empirical measure. A large number of bootstrap resampling schemes emerge as special cases of our settings. Examples of applications from the literature are given to illustrate the generality and the usefulness of our results. The second part of the thesis is devoted to the statistical models with multiple change-points. The main purpose of this part is to investigate the asymptotic properties of semiparametric M-estimators with non-smooth criterion functions of the parameters of multiple change-points model for a general class of models in which the form of the distribution can change from segment to segment and in which, possibly, there are parameters that are common to all segments. Consistency of the semiparametric M-estimators of the change-points is established and the rate of convergence is determined. 
The asymptotic normality of the semiparametric M-estimators of the parameters of the within-segment distributions is established under quite general conditions. We finally extend our study to the censored data framework. We investigate the performance of our methodologies for small samples through simulation studies
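The m out of n bootstrap studied in the first part of this thesis can be illustrated on a classic non-regular example, the sample maximum, for which the full n out of n bootstrap is known to fail; the choice m = √n and the uniform data are illustrative assumptions:

```python
import random

random.seed(7)
n = 1000
data = [random.random() for _ in range(n)]   # Uniform(0, 1) sample
theta_hat = max(data)                        # estimates the endpoint 1

m = int(n ** 0.5)   # resample size m = o(n); sqrt(n) is one common choice
B = 2000
# m out of n bootstrap: each replicate resamples only m observations.
boot = sorted(max(random.choices(data, k=m)) for _ in range(B))

# Percentile interval read from the m out of n bootstrap distribution.
lo, hi = boot[int(0.05 * B)], boot[int(0.95 * B)]
print(lo, theta_hat, hi)
```

Shrinking the resample size restores the weak consistency of the bootstrap distribution in such non-regular problems, which is the phenomenon the thesis establishes in a general semiparametric M-estimation setting.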
Kandji, Baye Matar. "Stochastic recurrent equations : structure, statistical inference, and financial applications." Electronic Thesis or Diss., Institut polytechnique de Paris, 2023. http://www.theses.fr/2023IPPAG004.
Full textWe are interested in the theoretical properties of Stochastic Recurrent Equations (SRE) and their applications in finance. These models are widely used in econometrics, including financial econometrics, to explain the dynamics of various processes such as the volatility of financial returns. However, the probability structure and statistical properties of these models are still not well understood, especially when the model is considered in infinite dimensions or driven by non-independent processes. These two features lead to significant difficulties in the theoretical study of these models. In this context, we aim to explore the existence of stationary solutions and the statistical and probabilistic properties of these solutions. We establish new properties of the trajectories of the stationary solution of SREs, which we use to study the asymptotic properties of the quasi-maximum likelihood estimator (QMLE) of GARCH-type (generalized autoregressive conditional heteroskedasticity) conditional volatility models. In particular, we study the stationarity and statistical inference of semi-strong GARCH(p,q) models, where the innovation process is not necessarily independent. We establish the consistency of the QMLE of semi-strong GARCHs without assuming the commonly used condition that the stationary distribution admits a small-order moment. In addition, we are interested in two-factor volatility GARCH models (GARCH-MIDAS), with a long-run and a short-run volatility component. These models were recently introduced by Engle et al. (2013) and have the particularity of admitting stationary solutions with heavy-tailed distributions. They are now widely used, but their statistical properties have not received much attention. We show the consistency and asymptotic normality of the QMLE of GARCH-MIDAS models and provide various test procedures to evaluate the presence of long-run volatility in these models.
We also illustrate our results with simulations and applications to real financial data. Finally, we extend a result of Kesten (1975) on the growth rate of additive sequences to superadditive processes. From this result, we derive generalizations of the contraction property of random matrices to products of stochastic operators. We use these results to establish necessary and sufficient conditions for the existence of stationary solutions of affine SREs with positive coefficients in the space of continuous functions. This class of models includes most conditional volatility models, including functional GARCHs
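The GARCH(1,1) recursion underlying the models discussed in this abstract can be sketched directly; the parameter values are illustrative, and Gaussian innovations stand in for the possibly dependent innovations of the semi-strong case studied in the thesis:

```python
import random

random.seed(5)
omega, alpha, beta = 0.1, 0.1, 0.8   # alpha + beta < 1: finite variance
T = 5000
sigma2 = omega / (1 - alpha - beta)  # start at the unconditional variance
eps = []
for _ in range(T):
    e = (sigma2 ** 0.5) * random.gauss(0, 1)        # return innovation
    eps.append(e)
    sigma2 = omega + alpha * e * e + beta * sigma2  # volatility recursion

# The sample variance should approach omega / (1 - alpha - beta) = 1.
var_hat = sum(e * e for e in eps) / T
print(var_hat)
```

This is an affine SRE in sigma2, which is why the existence-of-stationary-solution results for affine SREs with positive coefficients cover this whole model class.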
Magnanensi, Jérémy. "Amélioration et développement de méthodes de sélection du nombre de composantes et de prédicteurs significatifs pour une régression PLS et certaines de ses extensions à l'aide du bootstrap." Thesis, Strasbourg, 2015. http://www.theses.fr/2015STRAJ082/document.
Full textThe Partial Least Squares (PLS) regression has, through its properties, become a versatile statistical methodology for the analysis of genomic datasets. The reliability of PLS regression and some of its extensions relies on a robust determination of a tuning parameter, the number of components. This determination remains a major aim, since no existing criterion can be considered a global benchmark in the state-of-the-art literature. We developed a new bootstrap-based stopping criterion for the construction of PLS components that guarantees a high level of stability. We then adapted and used it to develop and improve variable selection processes, allowing a more reliable and robust determination of the significant probe sets related to the studied feature of a pathology
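The logic of a bootstrap-based stopping criterion for the number of components can be sketched schematically; the simulated per-observation prediction errors below stand in for actual PLS component fits, and the 95% level and the decision rule are assumptions, not the thesis's criterion:

```python
import random

random.seed(6)
n, B = 200, 1000
# Simulated squared prediction errors for models with k = 1, 2, 3
# components: the 2nd component helps, the 3rd slightly overfits.
errors = {
    1: [1.00 + random.gauss(0, 0.2) for _ in range(n)],
    2: [0.50 + random.gauss(0, 0.2) for _ in range(n)],
    3: [0.55 + random.gauss(0, 0.2) for _ in range(n)],
}

def improvement_ci(k):
    """Bootstrap 95% CI of the mean error drop from k to k + 1 components."""
    diffs = [a - b for a, b in zip(errors[k], errors[k + 1])]
    boots = sorted(sum(random.choices(diffs, k=n)) / n for _ in range(B))
    return boots[int(0.025 * B)], boots[int(0.975 * B)]

k = 1
while k + 1 in errors:
    lo, hi = improvement_ci(k)
    if lo <= 0:          # improvement no longer significant: stop
        break
    k += 1

print(k)   # number of retained components
```

The stopping decision is driven by a bootstrap interval rather than a point estimate of the error drop, which is what makes this family of criteria stable across resamples.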