Dissertations / Theses on the topic 'Non-normality'
Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles
Consult the top 50 dissertations / theses for your research on the topic 'Non-normality.'
Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.
Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.
Hristova-Bojinova, Daniela. "Non-normality and non-linearity in univariate standard models of inflation." Thesis, University of Leicester, 2002. http://hdl.handle.net/2381/30141.
Gordon, Carol J. (Carol Jean). "The Robustness of O'Brien's r Transformation to Non-Normality." Thesis, North Texas State University, 1985. https://digital.library.unt.edu/ark:/67531/metadc332002/.
Wu, Yinkai. "Non-normality, uncertainty and inflation forecasting : an analysis of China's inflation." Thesis, University of Leicester, 2016. http://hdl.handle.net/2381/37175.
Chuenpibal, Tanitpong. "If I pick up non-normality, can robust models make it better?" Thesis, University of Exeter, 2006. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.434877.
Sun, Qi. "Finite sample distributions and non-normality in second generation panel unit root tests." Thesis, University of Leicester, 2010. http://hdl.handle.net/2381/8929.
Shutes, Karl. "Non-normality in asset pricing- extensions and applications of the skew-normal distribution." Thesis, University of Sheffield, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.419870.
Lau, Christian [Verfasser], Jörg [Akademischer Betreuer] Laitenberger, and Claudia [Akademischer Betreuer] Becker. "Non-normality in financial markets and the measurement of risk / Christian Lau. Betreuer: Jörg Laitenberger ; Claudia Becker." Halle, Saale : Universitäts- und Landesbibliothek Sachsen-Anhalt, 2015. http://d-nb.info/1078505004/34.
Tano, Bask Andreas, and Johan Jaurin. "Det elliptiska säkerhetsområdets robusthet : hur robust är metoden med de elliptiska säkerhetsområdena förett symmetriskt men icke normalfördelat datamaterial?" Thesis, Umeå University, Department of Statistics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-34821.
Quality Control is a term often used in production and refers to managing processes so that they produce capable products. Within Quality Control, the process capability index is a common measure for overseeing processes, and Safety Region Plots were introduced to do this graphically. In Albing & Vännman (2010) the concept of Safety Region Plots is extended to an elliptical shape. The method of Elliptical Safety Region Plots assumes normally distributed data. In this paper we examine the robustness of the Elliptical Safety Region Plots when the data can be assumed symmetric but non-normal. The results show that an adjustment is required before the method of Albing & Vännman (2010) can be used with symmetric but non-normal data; a possible adjustment is discussed. To make the Elliptical Safety Region Plots of Albing & Vännman (2010) easy to use, we have developed a program in RExcel.
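As a rough illustration of the process capability index monitored in such plots, here is a minimal sketch (the specification limits and the simulated process data are invented for the example; this is not code from the thesis, which used RExcel):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=10.0, scale=0.5, size=10_000)  # simulated in-control process output

LSL, USL = 8.0, 12.0        # hypothetical lower/upper specification limits
mu = x.mean()
sigma = x.std(ddof=1)

cp = (USL - LSL) / (6 * sigma)               # potential capability (spread only)
cpk = min(USL - mu, mu - LSL) / (3 * sigma)  # actual capability, penalising off-centring

print(round(cp, 2), round(cpk, 2))  # both near 1.33 for this centred process
```

A Safety Region Plot would then display estimated (cp, cpk)-type pairs against a target region, elliptical in the Albing & Vännman (2010) variant.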
Rogers, Catherine Jane. "Power comparisons of four post-MANOVA tests under variance-covariance heterogeneity and non-normality in the two group case." Diss., Virginia Tech, 1994. http://hdl.handle.net/10919/40171.
Joo, Seang-Hwane. "Robustness of the Within- and Between-Series Estimators to Non-Normal Multiple-Baseline Studies: A Monte Carlo Study." Scholar Commons, 2017. http://scholarcommons.usf.edu/etd/6715.
Donmez, Ayca. "Adaptive Estimation And Hypothesis Testing Methods." Phd thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/3/12611724/index.pdf.
Maximum likelihood estimators (MLEs) are commonly used: they are consistent, unbiased and efficient, at any rate for large n. In most situations, however, MLEs are elusive because of computational difficulties. To alleviate these difficulties, Tiku's modified maximum likelihood estimators (MMLEs) are used. They are explicit functions of sample observations and easy to compute. They are asymptotically equivalent to MLEs and, for small n, are equally efficient. Moreover, MLEs and MMLEs are numerically very close to one another. For calculating MLEs and MMLEs, the functional form of the underlying distribution has to be known. For machine data processing, however, such is not the case. Instead, what is reasonable to assume for machine data processing is that the underlying distribution is a member of a broad class of distributions. Huber assumed that the underlying distribution is long-tailed symmetric and developed the so-called M-estimators. It is very desirable for an estimator to be robust and have a bounded influence function. M-estimators, however, implicitly censor certain sample observations, which most practitioners do not appreciate. Tiku and Surucu suggested a modification to Tiku's MMLEs. The new MMLEs are robust and have bounded influence functions. In fact, these new estimators are overall more efficient than M-estimators for long-tailed symmetric distributions. In this thesis, we have proposed a new modification to MMLEs. The resulting estimators are robust and have bounded influence functions. We have also shown that they can be used not only for long-tailed symmetric distributions but for skew distributions as well. We have used the proposed modification in the context of experimental design and linear regression. We have shown that the resulting estimators and the hypothesis testing procedures based on them are indeed superior to earlier such estimators and tests.
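For readers unfamiliar with M-estimation, the following sketch shows the classical Huber M-estimator of location (not Tiku's MMLE; the tuning constant 1.345 is the textbook default), illustrating the robustness-to-outliers property discussed above:

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted means."""
    mu = np.median(x)                        # robust starting value
    s = np.median(np.abs(x - mu)) / 0.6745   # MAD scale estimate
    for _ in range(max_iter):
        r = (x - mu) / s
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(0)
sample = np.concatenate([rng.normal(0.0, 1.0, 500), np.full(25, 50.0)])  # 5% gross outliers

print(sample.mean())            # dragged far from 0 by the outliers
print(huber_location(sample))   # stays near the true centre 0
```

The downweighting of large residuals is the "implicit censoring" the abstract alludes to; the MMLE family achieves robustness through explicit, closed-form estimators instead.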
Pospíšil, Tomáš. "STOCHASTIC MODELING OF COMPOSITE MATERIALS." Doctoral thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2010. http://www.nusl.cz/ntk/nusl-233889.
Uddin, Mohammad Moin. "ROBUST STATISTICAL METHODS FOR NON-NORMAL QUALITY ASSURANCE DATA ANALYSIS IN TRANSPORTATION PROJECTS." UKnowledge, 2011. http://uknowledge.uky.edu/gradschool_diss/153.
Yilmaz, Yildiz Elif. "Experimental Design With Short-tailed And Long-tailed Symmetric Error Distributions." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/12605191/index.pdf.
Kolli, Kranthi Kumar. "Domain Effects in the Finite / Infinite Time Stability Properties of a Viscous Shear Flow Discontinuity." Connect to this title, 2008. http://scholarworks.umass.edu/theses/204/.
Yilmaz, Yildiz Elif. "Bayesian Learning Under Nonnormality." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/3/12605582/index.pdf.
Herrington, Richard S. "Simulating Statistical Power Curves with the Bootstrap and Robust Estimation." Thesis, University of North Texas, 2001. https://digital.library.unt.edu/ark:/67531/metadc2846/.
Khalil, Nathalie. "Conditions d'optimalité pour des problèmes en contrôle optimal et applications." Thesis, Brest, 2017. http://www.theses.fr/2017BRES0095/document.
The project of this thesis is twofold. The first part concerns the extension of previous results on necessary optimality conditions for state constrained problems in optimal control and in the calculus of variations. The second aim consists in working along two new research lines: deriving viability results for a class of control systems with state constraints in which 'standard inward pointing conditions' are violated, and establishing necessary optimality conditions for average cost minimization problems possibly perturbed by unknown parameters. In the first part, we examine necessary optimality conditions, which play an important role in finding candidates for optimal solutions among all admissible solutions. However, in dynamic optimization problems with state constraints, some pathological situations might arise. For instance, it might occur that the multiplier associated with the objective function (to minimize) vanishes. In this case, the objective function to minimize does not intervene in the first order necessary conditions: this is referred to as the abnormal case. A worse phenomenon, called the degenerate case, shows that in some circumstances the set of admissible trajectories coincides with the set of candidate minimizers, so the necessary conditions give no information on the possible minimizers. To overcome these difficulties, new additional hypotheses have to be imposed, known as constraint qualifications. We investigate these two issues (normality and non-degeneracy) for optimal control problems involving state constraints and dynamics expressed as a differential inclusion, when the minimizer has its left end-point in a region where the state constraint set is nonsmooth.
We prove that, under an additional hypothesis involving mainly the Clarke tangent cone, necessary conditions in the form of the Extended Euler-Lagrange condition can be derived in normal and non-degenerate form for two different classes of state constrained optimal control problems. An application of the normality result is also shown for the calculus of variations problem subject to a state constraint. In the second part of the thesis, we first consider a class of state constrained control systems for which standard 'first order' constraint qualifications are not satisfied, but a higher (second) order constraint qualification is. We propose a new construction for feasible trajectories (a viability result) and we investigate examples (such as the Brockett nonholonomic integrator), providing in addition a non-linear estimate result. The other topic of the second part concerns the study of a class of optimal control problems in which uncertainties appear in the data in terms of unknown parameters. Considering an average cost criterion, a crucial issue is to characterize optimal controls independently of the unknown parameter: this allows us to find a sort of 'best compromise' among all the possible realizations of the control system as the parameter varies. For this type of problem, we derive necessary optimality conditions in the form of a Maximum Principle (possibly nonsmooth).
Hafsa, Houda. "Modèles d'évaluation et d'allocations des actifs financiers dans le cadre de non normalité des rendements : essais sur le marché français." Thesis, Aix-Marseille, 2012. http://www.theses.fr/2012AIXM1015.
This dissertation is part of an ongoing line of research looking for an adequate model to capture the behavior of financial asset returns. Through this research, we analyze the relevance of risk measures that take non-normality into account in asset pricing and portfolio allocation models on the French market. The dissertation comprises three articles. The first revisits the asset pricing model, taking higher-order moments into account in a downside framework. The results indicate that the downside higher-order co-moments are relevant in explaining the cross-sectional variation of returns. The second paper examines the relation between expected returns and the VaR or CVaR. A cross-sectional analysis provides evidence that VaR is a superior measure of risk compared to CVaR. We also find that the normal estimation approach gives better results than the approach based on the Cornish-Fisher (1937) expansion. Both results contradict the theoretical predictions, but we show that they are inherent to the French market. In the third paper, we revisit the mean-CVaR model in a dynamic framework and take transaction costs into account. The results indicate that an asset allocation model accounting for non-normality can improve portfolio performance relative to the mean-variance model, in terms of average return and return-to-CVaR ratio. Through these three studies, we argue that the risk management framework can be modified to better capture the risk of loss associated with non-normality.
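The two VaR estimation approaches compared in the second paper can be sketched as follows; the formulas are the standard Gaussian VaR and the Cornish-Fisher (1937) modified quantile, applied here to simulated heavy-tailed returns rather than the French market data used in the thesis:

```python
import numpy as np
from scipy.stats import norm, skew, kurtosis

rng = np.random.default_rng(42)
r = rng.standard_t(df=5, size=100_000) * 0.01  # simulated heavy-tailed daily returns

alpha = 0.01                     # 99% VaR
z = norm.ppf(alpha)
mu, sigma = r.mean(), r.std(ddof=1)

# Gaussian (normal) VaR: ignores skewness and excess kurtosis
var_normal = -(mu + sigma * z)

# Cornish-Fisher modified quantile using sample skewness S and excess kurtosis K
S, K = skew(r), kurtosis(r)      # scipy's kurtosis is excess kurtosis by default
z_cf = (z
        + (z**2 - 1) * S / 6
        + (z**3 - 3 * z) * K / 24
        - (2 * z**3 - 5 * z) * S**2 / 36)
var_cf = -(mu + sigma * z_cf)

print(var_normal, var_cf)  # the Cornish-Fisher VaR is larger for fat-tailed data
```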
Ferrani, Yacine. "Sur l'estimation non paramétrique de la densité et du mode dans les modèles de données incomplètes et associées." Thesis, Littoral, 2014. http://www.theses.fr/2014DUNK0370/document.
This thesis deals with the asymptotic properties of the kernel (Parzen-Rosenblatt) density estimator under association and censoring. In this setting, we first recall in detail the existing results, studied in both the i.i.d. case and under a strong mixing condition (α-mixing). Under mild standard conditions, it is established that the rate of strong uniform almost sure convergence is optimal. In the part dedicated to the results of this thesis, two main original results are presented. The first concerns the rate of strong uniform consistency of the studied estimator under the association hypothesis; the main tool that permits achieving the optimal rate is an adaptation of the theorem of Doukhan and Neumann (2007) to the fluctuation term (random part) of the gap between the considered estimator and the studied parameter (the density). As an application, the almost sure convergence of the kernel mode estimator is established. These results have been accepted for publication in Communications in Statistics - Theory & Methods. The second result establishes the asymptotic normality of the estimator under the same model and thus extends to the censored case the result stated by Roussas (2000). This result has been submitted for publication.
Benelmadani, Djihad. "Contribution à la régression non paramétrique avec un processus erreur d'autocovariance générale et application en pharmacocinétique." Thesis, Université Grenoble Alpes (ComUE), 2019. http://www.theses.fr/2019GREAM034/document.
In this thesis, we consider the fixed design regression model with repeated measurements, where the errors form a process with a general autocovariance function, i.e. a second order process (stationary or nonstationary) whose covariance function is non-differentiable along the diagonal. We are interested, among other problems, in the nonparametric estimation of the regression function of this model. We first consider the well-known kernel regression estimator proposed by Gasser and Müller. We study its asymptotic performance when the number of experimental units and the number of observations tend to infinity. For a regular sequence of designs, we improve the rates of convergence of the variance and the bias, and we prove the asymptotic normality of this estimator in the case of correlated errors. Second, we propose a new kernel estimator of the regression function based on a projection property. This estimator is constructed from the autocovariance function of the errors and a specific function belonging to the Reproducing Kernel Hilbert Space (RKHS) associated with the autocovariance function. We study its asymptotic performance using RKHS properties, which allow us to obtain the optimal convergence rate of the variance, and we prove its asymptotic normality. We show that this new estimator has a smaller asymptotic variance than that of Gasser and Müller; a simulation study confirms this theoretical result. Third, we propose a new kernel estimator of the regression function constructed through the trapezoidal numerical approximation of the kernel regression estimator based on continuous observations. We study its asymptotic performance and prove its asymptotic normality. Moreover, this estimator allows us to obtain the asymptotically optimal sampling design for the estimation of the regression function.
We run a simulation study to test the performance of the proposed estimator in a finite sample setting, where we observe its good performance in terms of Integrated Mean Squared Error (IMSE). In addition, we show the reduction of the IMSE obtained by using the optimal sampling design instead of the uniform design in a finite sample setting. Finally, we consider an application of regression function estimation to pharmacokinetics problems. We propose to use nonparametric kernel methods for the concentration-time curve estimation, instead of the classical parametric ones, and we demonstrate their good performance via a simulation study and real data analysis. We also investigate the problem of estimating the Area Under the concentration Curve (AUC), for which we introduce a new kernel estimator obtained by integrating the regression function estimator. We show, using a simulation study, that the proposed estimators outperform the classical one in terms of Mean Squared Error. The crucial problem of finding the optimal sampling design for the AUC estimation is investigated using the Generalized Simulated Annealing algorithm.
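The trapezoidal construction behind such AUC estimators can be illustrated on a toy concentration-time curve (the one-compartment model and its constants are invented for the example; the thesis builds the trapezoidal rule into a kernel estimator):

```python
import numpy as np

# Hypothetical one-compartment curve C(t) = D * (exp(-ke*t) - exp(-ka*t))
ka, ke, D = 1.5, 0.3, 10.0
t = np.linspace(0.0, 24.0, 13)   # 13 sampling times over 24 hours
c = D * (np.exp(-ke * t) - np.exp(-ka * t))

# Trapezoidal approximation of the Area Under the Curve
auc_trapz = np.sum((c[1:] + c[:-1]) / 2 * np.diff(t))

# Closed-form AUC over [0, 24] for this particular curve, for comparison
auc_exact = D * ((1 - np.exp(-ke * 24)) / ke - (1 - np.exp(-ka * 24)) / ka)

print(auc_trapz, auc_exact)
```

Choosing the sampling times t to minimise the resulting error is exactly the optimal-design question the abstract raises.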
Manandhar, Binod. "Bayesian Models for the Analyzes of Noisy Responses From Small Areas: An Application to Poverty Estimation." Digital WPI, 2017. https://digitalcommons.wpi.edu/etd-dissertations/188.
Chandler, Gary James. "Sensitivity analysis of low-density jets and flames." Thesis, University of Cambridge, 2011. https://www.repository.cam.ac.uk/handle/1810/246531.
Full textEl, Heda Khadijetou. "Choix optimal du paramètre de lissage dans l'estimation non paramétrique de la fonction de densité pour des processus stationnaires à temps continu." Thesis, Littoral, 2018. http://www.theses.fr/2018DUNK0484/document.
This thesis focuses on the choice of the smoothing parameter in the nonparametric estimation of the density function for stationary ergodic continuous time processes. The accuracy of the estimation depends greatly on the choice of this parameter. The main goal of this work is to build an automatic bandwidth selection procedure and to establish its asymptotic properties under a general dependency framework that can easily be used in practice. The manuscript is divided into three parts. The first part reviews the literature on the subject, sets out the state of the art, and discusses our contribution within it. In the second part, we design an automatic method for selecting the smoothing parameter when the density is estimated by the kernel method; this choice, stemming from the cross-validation method, is asymptotically optimal. In the third part, we establish asymptotic properties, namely consistency with rate, of the resulting bandwidth estimate.
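A minimal sketch of bandwidth selection by least-squares cross-validation for a Gaussian-kernel density estimator (i.i.d. simulated data; the thesis treats the much harder continuous-time ergodic setting):

```python
import numpy as np

def lscv_score(x, h):
    """Least-squares cross-validation criterion for a Gaussian-kernel KDE."""
    n = len(x)
    d = x[:, None] - x[None, :]                  # pairwise differences
    phi = lambda u, s: np.exp(-0.5 * (u / s) ** 2) / (s * np.sqrt(2 * np.pi))
    int_f2 = phi(d, h * np.sqrt(2)).sum() / n**2  # integral of the squared estimate
    loo = phi(d, h)
    np.fill_diagonal(loo, 0.0)                   # leave-one-out: drop own point
    cv_term = loo.sum() / (n * (n - 1))
    return int_f2 - 2 * cv_term

rng = np.random.default_rng(7)
x = rng.normal(size=300)
grid = np.linspace(0.08, 1.2, 57)
scores = [lscv_score(x, h) for h in grid]
h_cv = grid[int(np.argmin(scores))]
print(h_cv)
```

The criterion estimates the integrated squared error of the KDE up to a constant; its minimiser is the cross-validated bandwidth.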
Lacombe, Jean-Pierre. "Analyse statistique de processus de poisson non homogènes. Traitement statistique d'un multidétecteur de particules." Phd thesis, Grenoble 1, 1985. http://tel.archives-ouvertes.fr/tel-00318875.
Purutcuoglu, Vilda. "Unit Root Problems In Time Series Analysis." Master's thesis, METU, 2004. http://etd.lib.metu.edu.tr/upload/2/12604701/index.pdf.
The distribution of the test statistic is not accurate even if the sample size is very large. Hence, the Wiener process is invoked to obtain the asymptotic distribution of the LSE under normality, and the first four moments of the LSE under normality have been worked out for large n. In 1998, Tiku and Wong proposed new test statistics whose type I error and power values are calculated by using three-moment chi-square or four-moment F approximations. The test statistics are based on the modified maximum likelihood estimators and the least squares estimators, respectively. They evaluated the type I errors and the power of these tests for a family of symmetric distributions (scaled Student's t). In this thesis, we have extended this work to skewed distributions, namely, gamma and generalized logistic.
Servien, Rémi. "Estimation de régularité locale." Phd thesis, Université Montpellier II - Sciences et Techniques du Languedoc, 2010. http://tel.archives-ouvertes.fr/tel-00730491.
Caron, Emmanuel. "Comportement des estimateurs des moindres carrés du modèle linéaire dans un contexte dépendant : Étude asymptotique, implémentation, exemples." Thesis, Ecole centrale de Nantes, 2019. http://www.theses.fr/2019ECDN0036.
In this thesis, we consider the usual linear regression model in the case where the error process is assumed strictly stationary. We use a result from Hannan (1973), who proved a Central Limit Theorem for the usual least squares estimator under general conditions on the design and on the error process. For any design and error process satisfying Hannan's conditions, we define an estimator of the asymptotic covariance matrix of the least squares estimator and we prove its consistency under very mild conditions. We then show how to modify the usual tests on the parameters of the linear model in this dependent context, and we propose various methods to estimate the covariance matrix in order to correct the type I error rate of the tests. The R package slm that we have developed contains all of these statistical methods. The procedures are evaluated through different sets of simulations, and two particular example datasets are studied. Finally, in the last chapter, we propose a non-parametric method by penalization to estimate the regression function in the case where the errors are Gaussian and correlated.
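The kind of covariance correction involved can be sketched in a few lines; below is a standard Newey-West (Bartlett-kernel) variance estimator for the OLS slope under autocorrelated errors, written from the textbook formula rather than taken from the slm package:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = np.linspace(0, 1, n)

# AR(1) errors: strongly autocorrelated, violating the iid assumption
eps = np.empty(n)
eps[0] = rng.normal()
for s in range(1, n):
    eps[s] = 0.7 * eps[s - 1] + rng.normal()
y = 2.0 + 3.0 * x + eps

X = np.column_stack([np.ones(n), x])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
u = y - X @ beta

# Newey-West (Bartlett kernel) covariance of the OLS estimator
L = int(4 * (n / 100) ** (2 / 9))          # a common bandwidth rule of thumb
S = (X * u[:, None]).T @ (X * u[:, None]) / n
for l in range(1, L + 1):
    w = 1 - l / (L + 1)
    G = (X[l:] * u[l:, None]).T @ (X[:-l] * u[:-l, None]) / n
    S += w * (G + G.T)
Q = X.T @ X / n
cov_nw = np.linalg.inv(Q) @ S @ np.linalg.inv(Q) / n

se_naive = np.sqrt(np.sum(u**2) / (n - 2) * np.linalg.inv(X.T @ X)[1, 1])
se_nw = np.sqrt(cov_nw[1, 1])
print(se_naive, se_nw)  # the corrected standard error is noticeably larger here
```

Using the naive standard error here would make tests on the slope badly oversized, which is precisely the type I error problem the thesis corrects.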
Kabui, Ali. "Value at risk et expected shortfall pour des données faiblement dépendantes : estimations non-paramétriques et théorèmes de convergences." Phd thesis, Université du Maine, 2012. http://tel.archives-ouvertes.fr/tel-00743159.
Boyer, Germain. "Étude de stabilité et simulation numérique de l'écoulement interne des moteurs à propergol solide simplifiés." Thesis, Toulouse, ISAE, 2012. http://www.theses.fr/2012ESAE0029/document.
The current work deals with the modeling of the hydrodynamic instabilities that play a major role in triggering the Pressure Oscillations occurring in large segmented solid rocket motors. These instabilities are responsible for the emergence of Parietal Vortex Shedding (PVS) and they interact with the boosters' acoustics. They are first modeled as eigenmodes of the internal steady flowfield of a cylindrical duct with sidewall injection, within the global linear stability theory framework. Assuming that the related parietal structures emerge from a baseflow disturbance, discrete mesh-independent eigenmodes are computed. For this purpose, a multi-domain spectral collocation technique is implemented in a parallel solver to tackle numerical issues such as the polynomial axial amplification of the eigenfunctions and the existence of boundary layers. The resulting eigenvalues explicitly depend on the locations of the boundaries, namely those of the baseflow disturbance and the duct exit, and are then validated by performing Direct Numerical Simulations. First, they successfully describe the flow response to an initial disturbance with a break in the sidewall velocity injection. Then, the simulated forced response to acoustics consists of vortical structures whose discrete frequencies are in good agreement with those of the eigenmodes. These structures are reflected into upstream pressure waves with identical frequencies. Finally, the PVS, whose response to a compressible forcing such as the acoustic one is linear, is understood as the driving phenomenon of the Pressure Oscillations, thanks to both numerical simulation and stability theory.
Ahmad, Ali. "Contribution à l'économétrie des séries temporelles à valeurs entières." Thesis, Lille 3, 2016. http://www.theses.fr/2016LIL30059/document.
The framework of this PhD dissertation is conditional mean count time series models. We propose the Poisson quasi-maximum likelihood estimator (PQMLE) for the conditional mean parameters and show that, under quite general regularity conditions, this estimator is consistent and asymptotically normal for a wide class of count time series models. Since the conditional mean parameters of some models are positively constrained, as, for example, in the integer-valued autoregressive (INAR) and integer-valued generalized autoregressive conditional heteroscedasticity (INGARCH) models, we study the asymptotic distribution of this estimator when the parameter lies at the boundary of the parameter space. We deduce a Wald-type test for the significance of the parameters and another Wald-type test for the constancy of the conditional mean. Subsequently, we propose a robust and general goodness-of-fit test for count time series models. We derive the joint distribution of the PQMLE and of the empirical residual autocovariances, and then deduce the asymptotic distribution of the estimated residual autocovariances and of a portmanteau test. Finally, we propose the PQMLE for estimating, equation by equation (EbE), the conditional mean parameters of a multivariate time series of counts. Using slightly different assumptions from those given for the PQMLE, we show the consistency and asymptotic normality of this estimator for a considerable variety of multivariate count time series models.
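A minimal sketch of a Poisson QMLE for a first-order linear conditional-mean count model (an INARCH(1)-type recursion, chosen here for illustration; the thesis covers a much wider class, including boundary cases):

```python
import numpy as np
from scipy.optimize import minimize

# Simulate a count series with conditional mean lambda_t = omega + alpha * y_{t-1}
rng = np.random.default_rng(2)
omega_true, alpha_true, n = 1.0, 0.5, 3000
y = np.empty(n, dtype=int)
y[0] = rng.poisson(omega_true)
for t in range(1, n):
    y[t] = rng.poisson(omega_true + alpha_true * y[t - 1])

def neg_poisson_qll(theta):
    """Negative Poisson quasi-log-likelihood (additive constants dropped)."""
    omega, alpha = theta
    lam = omega + alpha * y[:-1]
    return np.sum(lam - y[1:] * np.log(lam))

res = minimize(neg_poisson_qll, x0=[0.5, 0.2], method="L-BFGS-B",
               bounds=[(1e-6, None), (0.0, 0.99)])
omega_hat, alpha_hat = res.x
print(omega_hat, alpha_hat)  # close to the true values (1.0, 0.5)
```

The same criterion remains consistent for the conditional mean even when the true conditional distribution is not Poisson, which is the "quasi" in PQMLE.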
Reding, Lucas. "Contributions au théorème central limite et à l'estimation non paramétrique pour les champs de variables aléatoires dépendantes." Thesis, Normandie, 2020. http://www.theses.fr/2020NORMR049.
This thesis deals with the central limit theorem for dependent random fields and its applications to nonparametric statistics. In the first part, we establish some quenched central limit theorems for random fields satisfying a projective condition à la Hannan (1973); functional versions of these theorems are also considered. In the second part, we prove the asymptotic normality of kernel density and regression estimators for strongly mixing random fields in the sense of Rosenblatt (1956) and for weakly dependent random fields in the sense of Wu (2005). First, we establish the result for the kernel regression estimator introduced by Elizbar Nadaraya (1964) and Geoffrey Watson (1964). Then, we extend these results to a large class of recursive estimators defined by Peter Hall and Prakash Patil (1994).
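The Nadaraya-Watson estimator mentioned above has a compact ratio form; a minimal i.i.d. sketch (the thesis' setting of dependent random fields is far more delicate):

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Kernel regression m(x0) = sum(K_i * y_i) / sum(K_i), Gaussian kernel."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(5)
x = rng.uniform(0, 2 * np.pi, 400)
y = np.sin(x) + rng.normal(scale=0.3, size=400)

grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)
m_hat = nadaraya_watson(grid, x, y, h=0.25)
err = np.max(np.abs(m_hat - np.sin(grid)))
print(err)  # small uniform error over the interior of the design
```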
Bassene, Aladji. "Contribution à la modélisation spatiale des événements extrêmes." Thesis, Lille 3, 2016. http://www.theses.fr/2016LIL30039/document.
In this thesis, we investigate nonparametric modeling of spatial extremes. Our results are based on the main result of the theory of extreme values and thereby encompass Pareto laws. This framework allows the study of extreme events to be extended to the spatial case, provided the asymptotic properties of the proposed estimators satisfy the standard conditions of Extreme Value Theory (EVT) in addition to local conditions on the data structure itself. In the literature, there exists a vast panorama of extreme event models adapted to the structure of the data of interest. However, in the case of extreme spatial data, apart from max-stable models, few if any models address nonparametric estimation of the tail index and/or extreme quantiles. We therefore extend existing work on estimating the tail index and quantiles under independent or time-dependent data. The specificity of the methods studied resides in the fact that the asymptotic results of the proposed estimators take into account the spatial dependence structure of the relevant data, which is far from trivial. This thesis is thus written in the context of spatial statistics of extremes, and it makes three main contributions.
• In the first contribution of this thesis, we propose a new approach to estimating the tail index of a heavy-tailed distribution in the framework of spatial data. This approach relies on the estimator of Hill (1975). The asymptotic properties of the introduced estimator are established when the spatial process is adequately approximated by a spatial M-dependent process or a spatial linear causal process, or when the process satisfies a strong mixing condition.
• In practice, it is often useful to link the variable of interest Y with a covariate X. In this situation, the tail index depends on the observed value x of the covariate X, and the unknown function is called the conditional tail index. In most applications, the tail index of an extreme value is not the main attraction, but it is used to estimate, for instance, extreme quantiles. The contribution of this chapter is to adapt the tail index estimator introduced in the first part to the conditional framework and to use it to propose an estimator of conditional extreme quantiles. We examine the models called 'fixed design', which correspond to the situation where the explanatory variable is deterministic. To tackle the covariate, since it is deterministic, we use the moving window approach. We study the asymptotic behavior of the proposed estimators and present some numerical results using simulated data with the software R.
• In the third part of this thesis, we extend the work of the second part to the framework of models called 'random design', for which the data are spatial observations of a pair (Y, X) of real random variables. In this last model, we propose an estimator of the heavy tail index using the kernel method to tackle the covariate. We use an estimator of the conditional tail index belonging to the family of estimators introduced by Goegebeur et al. (2014b).
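The Hill (1975) estimator on which the first contribution builds can be written in a few lines; here on i.i.d. simulated Pareto data rather than a spatial field:

```python
import numpy as np

def hill_estimator(x, k):
    """Hill estimate of the tail index gamma from the k largest observations."""
    xs = np.sort(x)[::-1]        # descending order statistics
    logs = np.log(xs[:k + 1])
    return np.mean(logs[:k] - logs[k])

rng = np.random.default_rng(11)
gamma_true = 0.5                 # Pareto tail index (Pareto exponent alpha = 1/gamma = 2)
u = rng.uniform(size=20_000)
x = (1 - u) ** (-gamma_true)     # inverse-transform sampling of a Pareto law

gamma_hat = hill_estimator(x, k=500)
print(gamma_hat)  # close to 0.5
```

The choice of k (how far into the tail to look) is the classical bias-variance trade-off; the spatial extensions in the thesis keep this form but change the asymptotics.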
Bouhadjera, Feriel. "Estimation non paramétrique de la fonction de régression pour des données censurées : méthodes locale linéaire et erreur relative." Thesis, Littoral, 2020. http://www.theses.fr/2020DUNK0561.
In this thesis, we are interested in developing robust and efficient methods for the nonparametric estimation of the regression function. The model considered here is the right randomly censored model, which is the most used in various practical fields. First, we propose a new estimator of the regression function by the local linear method and study its almost sure uniform convergence with rate, improving the order of the bias term. We then compare its performance with that of the classical kernel regression estimator using simulations. Second, we consider the regression function estimator based on the minimization of the mean squared relative error (the so-called relative regression estimator). We establish the uniform almost sure consistency with rate of the estimator for independent and identically distributed observations, prove its asymptotic normality, and give the explicit expression of the variance term. We conduct a simulation study to confirm our theoretical results and apply our estimator to real data. Next, we study the almost sure uniform convergence (on a compact set) with rate of the relative regression estimator for observations subject to an α-mixing dependency structure; a simulation study shows the good behaviour of the studied estimator, and predictions on generated data illustrate its robustness. Finally, we establish the asymptotic normality of the relative regression function estimator for α-mixing data, construct confidence intervals, and perform a simulation study to validate our theoretical results. Beyond the analysis of censored data, the common thread of this contribution is the proposal of two prediction methods as alternatives to classical regression: the first corrects the boundary effects created by classical kernel estimators and reduces the bias term, while the second is more robust and less affected by the presence of outliers in the sample.
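The relative-error regression idea takes a simple kernel-ratio form, estimating E[1/Y | X=x] / E[1/Y² | X=x], the minimiser of the mean squared relative error; a sketch assuming positive responses (uncensored simulated data, unlike the thesis' censored setting):

```python
import numpy as np

def relative_regression(x0, x, y, h):
    """Kernel estimate of E[Y^-1|x] / E[Y^-2|x]; requires positive responses y."""
    w = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    num = (w / y).sum(axis=1)
    den = (w / y**2).sum(axis=1)
    return num / den

rng = np.random.default_rng(9)
x = rng.uniform(0, 1, 500)
y = np.exp(1.0 + x) * rng.lognormal(sigma=0.2, size=500)  # positive, multiplicative noise

grid = np.linspace(0.1, 0.9, 9)
m_hat = relative_regression(grid, x, y, h=0.08)
print(m_hat)  # tracks the trend exp(1 + x) up to a small multiplicative factor
```

Because the criterion divides by Y, large responses get small weight, which is the source of the robustness to outliers mentioned above.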
Esstafa, Youssef. "Modèles de séries temporelles à mémoire longue avec innovations dépendantes." Thesis, Bourgogne Franche-Comté, 2019. http://www.theses.fr/2019UBFCD021.
Full text
We first consider, in this thesis, the problem of statistical analysis of FARIMA (Fractionally AutoRegressive Integrated Moving-Average) models endowed with uncorrelated but non-independent error terms. These models are called weak FARIMA models and can be used to fit long-memory processes with general nonlinear dynamics. Relaxing the independence assumption on the noise, a standard assumption usually imposed in the literature, allows weak FARIMA models to cover a large class of nonlinear long-memory processes. The weak FARIMA models are dense in the set of purely non-deterministic stationary processes, and this class encompasses that of FARIMA processes with independent and identically distributed (iid) noise. We hereafter call strong FARIMA models those in which the error term is assumed to be an iid innovation.

We establish procedures for estimating and validating weak FARIMA models. We show, under weak assumptions on the noise, that the least squares estimator of the parameters of weak FARIMA(p,d,q) models is strongly consistent and asymptotically normal. The asymptotic variance matrix of the least squares estimator has the "sandwich" form and can be very different from the asymptotic variance obtained in the strong case (i.e. when the noise is assumed to be iid). We propose, by two different methods, a convergent estimator of this matrix. An alternative method based on a self-normalization approach is also proposed to construct confidence intervals for the parameters of weak FARIMA(p,d,q) models.

We then pay particular attention to the problem of validating weak FARIMA(p,d,q) models. We show that the residual autocorrelations have a normal asymptotic distribution with a covariance matrix different from that obtained in the strong FARIMA case. This allows us to deduce the exact asymptotic distribution of portmanteau statistics and thus to propose modified versions of portmanteau tests. It is well known that the asymptotic distribution of portmanteau tests is correctly approximated by a chi-squared distribution when the error term is assumed to be iid. In the general case, we show that this asymptotic distribution is a mixture of chi-squared distributions, which can be very different from the usual chi-squared approximation of the strong case. We adopt the same self-normalization approach used for constructing the confidence intervals of weak FARIMA model parameters to test the adequacy of weak FARIMA(p,d,q) models. This method has the advantage of avoiding the problem of estimating the asymptotic variance matrix of the joint vector of the least squares estimator and the empirical autocovariances of the noise.

Secondly, we deal with the problem of estimating autoregressive models of order 1 endowed with fractional Gaussian noise when the Hurst parameter H is assumed to be known. More precisely, we study the convergence and the asymptotic normality of the generalized least squares estimator of the autoregressive parameter of these models.
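The fractional differencing operator (1−B)^d at the heart of FARIMA models expands as Σ_k π_k B^k with π_0 = 1 and π_k = π_{k−1}(k−1−d)/k. A small sketch of the truncated filter (an illustration of the operator only, not the estimation procedures of the thesis):

```python
import numpy as np

def frac_diff_weights(d, n):
    """Coefficients pi_k of (1-B)^d = sum_k pi_k B^k, via the
    recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply the fractional difference filter (1-B)^d to series x,
    truncated at the available history."""
    w = frac_diff_weights(d, len(x))
    return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])

# for d = 0.3: pi_0 = 1, pi_1 = -0.3, pi_2 = -0.105, ...
w = frac_diff_weights(0.3, 5)
```

For d = 1 the weights reduce to (1, −1, 0, …), i.e. the ordinary first difference, which is a quick sanity check on the recursion.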
Liao, Hung-Neng, and 廖宏能. "Evaluation of Tukey’s Control Chart to Non-normality." Thesis, 2006. http://ndltd.ncl.edu.tw/handle/28347091111825982109.
Full text
National Yunlin University of Science and Technology
Institute of Industrial Engineering and Management (Master's Program)
94
Alemi (2004) proposed an individual control chart, the Tukey control chart, in the Journal of Quality Management in Health Care. The proposed chart is easy to construct and has several advantages: (1) it works well with small sample sizes, (2) it does not require parameter estimation, (3) it is unaffected by non-normal data distributions, and (4) it is not negatively impacted by outliers. The chart has attracted scholarly attention since it was proposed, but existing research focuses on its power when the data exhibit various levels of autocorrelation. This research therefore investigates the effect of non-normality on the Tukey control chart and characterizes its properties.
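For reference, the Tukey control chart places its limits at the quartiles plus or minus a multiple of the interquartile range. A minimal sketch for individual observations (the fence constant 1.5 is the usual choice; the data are illustrative):

```python
import numpy as np

def tukey_chart_limits(x, k=1.5):
    """Tukey control chart limits for individual observations:
    LCL = Q1 - k*IQR, UCL = Q3 + k*IQR (k = 1.5 is the usual
    fence constant). No distributional parameters are estimated."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

rng = np.random.default_rng(1)
x = rng.normal(10, 1, 100)                 # illustrative in-control data
lcl, ucl = tukey_chart_limits(x)
out_of_control = (x < lcl) | (x > ucl)     # points flagged by the chart
```

Because the limits depend only on quartiles, they are insensitive to a few extreme observations, which is exactly the robustness claim examined in the thesis.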
Zheng, Bing Hong, and 鄭炳宏. "Control charts for process variability under non-normality." Thesis, 1996. http://ndltd.ncl.edu.tw/handle/87894584813549733622.
Full textZheng, Bing-Hong, and 鄭炳宏. "Control Charts for Process Variability Under Non-Normality." Thesis, 1996. http://ndltd.ncl.edu.tw/handle/69749030505398333717.
Full textShiau, Wu-Fu, and 蕭武夫. "Non-normality and the Control Chart for Median." Thesis, 1997. http://ndltd.ncl.edu.tw/handle/09868522092875983900.
Full textGuo, Hong-Ying, and 郭虹纓. "Robustness Assessment of Process Capability Indices Under Non-Normality." Thesis, 1996. http://ndltd.ncl.edu.tw/handle/70978286438328625531.
Full textCHEN, WEI-ZHI, and 陳偉智. "Design of Run Sum Control Chart Under Non-Normality." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/5c2m9j.
Full text
National Yunlin University of Science and Technology
Department of Industrial Engineering and Management
107
Owing to the rise of consumer awareness, quality improvement has become a primary goal of companies, and process stability has a direct impact on quality. The control chart, the most effective and widely used tool of Statistical Process Control (SPC), is often used to monitor processes. Shewhart control charts perform well when the process shifts greatly but poorly when it shifts only slightly. Roberts (1966) proposed the Run Sum control chart, whose performance is clearly better than the Shewhart chart's for medium or small shifts; it is also easy and convenient to operate, and thus provides a better option for process control. Control charts are usually constructed under the assumption that the data are normally distributed, but in reality the measurement data of many processes do not follow a normal distribution. Chang and Bai (2001) noted that measurements from chemical processes, semiconductor processes, and cutting-tool wear processes are often skewed. When data skewness increases and control charts established under the normal distribution are used to monitor skewed data, the false alarm rate increases. In this study, different Burr distributions are used to represent non-normal distributions with different degrees of skewness, and an asymmetric Run Sum control chart design suited to the data distribution is established for the non-normal case. A genetic algorithm is used to find the optimal parameter combination of the control chart. Finally, its performance is compared with that of the asymmetric Shewhart control chart under non-normal distributions. The results show that the proposed asymmetric Run Sum control chart detects process shifts more quickly under non-normal distributions.
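A run sum chart accumulates zone scores while points stay on one side of the centre line and signals when the running sum reaches a limit. A toy two-sided sketch in the style of Roberts (1966) (the 1-sigma zones, scores, and limit here are illustrative, not the optimized asymmetric design of the thesis):

```python
import numpy as np

def run_sum_chart(x, mu, sigma, scores=(0, 1, 2, 4), limit=5):
    """Minimal two-sided run sum chart. Each side of the centre line
    is split into 1-sigma zones with the given scores; scores
    accumulate while points stay on one side, reset when the side
    changes, and an out-of-control signal fires (and the run resets)
    when the running sum reaches `limit`."""
    run, side, signals = 0, 0, []
    for t, xi in enumerate(x):
        z = (xi - mu) / sigma
        s = 1 if z >= 0 else -1
        if s != side:                 # crossed the centre line: restart run
            run, side = 0, s
        run += scores[min(int(abs(z)), len(scores) - 1)]
        if run >= limit:              # signal and reset
            signals.append(t)
            run = 0
    return signals

signals = run_sum_chart([0.5, 0.5, 3.5, 3.5], mu=0.0, sigma=1.0)
no_signal = run_sum_chart([0.1, -0.1, 0.2, -0.2], mu=0.0, sigma=1.0)
```

Small oscillations around the centre line keep resetting the run, while a sustained shift piles up scores quickly, which is why the chart beats the Shewhart chart on small and medium shifts.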
Kang, Fu-Sen, and 康富森. "The Research on Non-normality of individual EWMA Control Chart." Thesis, 2001. http://ndltd.ncl.edu.tw/handle/20742848115769457358.
Full text
Tamkang University
Department of Statistics
89
On variables control charts we usually suppose the quality characteristic is normally distributed. Nelson (1976) discussed the x-bar control chart under non-normality; that study indicates that data from a gamma distribution with a small shape parameter r (r = 0.5 and 1) produce an α-risk that differs greatly from the normal case. In this paper, I discuss the individual EWMA control chart under non-normality, using charting, Gaussian quadrature, and simulation to examine how strongly the two control charts depend on normality. The results show that the individuals (X) control chart is more sensitive to departures from normality than the individual EWMA control chart.
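For reference, the individual EWMA chart smooths observations as z_t = λx_t + (1−λ)z_{t−1} and compares z_t against time-varying limits μ₀ ± Lσ√(λ/(2−λ)·(1−(1−λ)^{2t})). A minimal sketch (the choices λ = 0.2 and L = 3 are illustrative):

```python
import numpy as np

def ewma_chart(x, mu0, sigma, lam=0.2, L=3.0):
    """Individual EWMA chart: z_t = lam*x_t + (1-lam)*z_{t-1}, z_0 = mu0,
    with exact time-varying limits
    mu0 +/- L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)**(2t)))."""
    z = np.empty(len(x))
    prev = mu0
    signals = []
    for t, xi in enumerate(x, start=1):
        prev = lam * xi + (1 - lam) * prev
        z[t - 1] = prev
        half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(prev - mu0) > half:    # outside the limits: signal
            signals.append(t - 1)
    return z, signals

x = [0.0] * 10 + [5.0] * 10          # in control, then a 5-sigma shift
z, signals = ewma_chart(x, mu0=0.0, sigma=1.0)
```

The smoothing is what makes the EWMA statistic closer to normal than the raw individuals, which is consistent with the thesis's finding that it is less sensitive to non-normality than the X chart.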
Pan, Peng-Yuan, and 潘鵬元. "Application of Bootstrap Method in Process Incapability Index under Non-Normality." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/03771201419133603230.
Full text
National Yunlin University of Science and Technology
Institute of Industrial Engineering and Management (Master's Program)
92
Process capability index analysis is one of the most important tools for quality control. Traditionally, the underlying distribution of the measurements has been assumed to be normal; however, real manufacturing processes are often non-normal. In this paper, the Burr distribution and the nonparametric bootstrap method are employed to examine confidence intervals for the process incapability index defined by Greenwich and Jahr-Schaffrath (1995). According to the results, the kurtosis coefficient has a more significant effect than the skewness coefficient. Therefore, when estimating the process incapability index with the bootstrap method, one should examine how far the kurtosis of the sample distribution departs from normality.
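The study pairs the Greenwich and Jahr-Schaffrath incapability index with a nonparametric bootstrap. A sketch under the commonly stated form Cpp = ((x̄−T)² + s²)/(d/3)² with d = (USL−LSL)/2 (the data and specification limits below are illustrative, not those of the thesis):

```python
import numpy as np

def cpp(x, target, usl, lsl):
    """Process incapability index in the commonly stated form
    Cpp = ((mean - T)^2 + s^2) / (d/3)^2, d = (USL - LSL)/2."""
    dstar = (usl - lsl) / 6.0
    return ((x.mean() - target) ** 2 + x.var(ddof=1)) / dstar ** 2

def bootstrap_ci(x, stat, n_boot=2000, alpha=0.05, seed=0):
    """Nonparametric percentile bootstrap confidence interval."""
    rng = np.random.default_rng(seed)
    reps = [stat(rng.choice(x, size=len(x), replace=True))
            for _ in range(n_boot)]
    return np.percentile(reps, [100 * alpha / 2, 100 * (1 - alpha / 2)])

rng = np.random.default_rng(42)
x = rng.normal(10.0, 0.5, 80)          # illustrative on-target process
lo, hi = bootstrap_ci(x, lambda s: cpp(s, target=10.0, usl=13.0, lsl=7.0))
```

The percentile interval makes no normality assumption, which is the point of combining it with skewed (e.g. Burr-distributed) data in the thesis.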
Hsu, Ya-Chen, and 徐雅甄. "Robustness of the EWMA Control Chart to Non-normality for Autocorrelated Processes." Thesis, 2003. http://ndltd.ncl.edu.tw/handle/73662462652940348438.
Full textWeng, Tzu-Ying, and 翁慈霙. "Generalized Uniform Integrability and Its Applications to Asymptotic Normality for Non-i.i.d. Sequences." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/30913090148719235013.
Full text
National Changhua University of Education
Department of Mathematics
96
When one proves central limit theorems for dependent sequences of random variables, two prerequisites must be verified: stochastic convergence of the normalized sample second moment and the renowned Lindeberg condition; see Chow and Teicher (1997) and Hall and Heyde (1980) for a host of references. In this thesis, we introduce a general concept of uniform integrability and then exploit it to prove asymptotic normality for non-i.i.d. sequences.
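For reference, the classical Lindeberg condition mentioned above, for independent (not necessarily identically distributed) X₁, X₂, … with μ_k = E[X_k] and s_n² = Σ_{k≤n} Var(X_k), reads:

```latex
% Lindeberg condition for a sequence X_1, X_2, ... with
% mu_k = E[X_k] and s_n^2 = \sum_{k=1}^{n} Var(X_k):
\forall\, \varepsilon > 0:\qquad
\frac{1}{s_n^{2}} \sum_{k=1}^{n}
\mathbb{E}\!\left[ (X_k - \mu_k)^{2} \,
\mathbf{1}\!\left\{ |X_k - \mu_k| > \varepsilon\, s_n \right\} \right]
\;\longrightarrow\; 0
\quad (n \to \infty).
```

Uniform integrability conditions of the kind studied in the thesis are one standard route to verifying this truncated-second-moment requirement.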
Wang, Pin-Hao, and 王品皓. "The Economic Design of Average Control chart Under Non-normality and Correlated Subgroups." Thesis, 2000. http://ndltd.ncl.edu.tw/handle/15690059983602600548.
Full text
National Yunlin University of Science and Technology
Institute of Industrial Engineering and Management
88
Since 1924, when Dr. Shewhart presented the first control chart, statistical methods have found wide application in industrial process control. Duncan (1956) proposed the first model for determining the sample size (n), the interval between successive samples (h), and the control limits of an x-bar control chart that minimize the average cost when a single out-of-control state (assignable cause) exists. Traditionally, when designing control charts, one assumes the measurements in the sample are normally distributed and independent. However, this assumption may not be tenable. If the measurements are asymmetrically distributed and correlated, the sample mean will be approximately normally distributed only when the sample size n is sufficiently large, which may reduce the chart's ability to detect assignable causes.

In this paper, the economic design of the x-bar chart under non-normality and correlated samples is developed using the Burr distribution. The research has three parts: the economic-statistical design of the chart using Duncan's and Alexander's cost models for non-normal data; the economic design using Duncan's cost model for normal, correlated data; and the economic design using Duncan's and Alexander's cost models for non-normal, correlated data.

The comparisons show that, under Duncan's cost model with correlated data, an increase in the correlation coefficient leads to increases in both the sample size and the sampling interval, and to wider control limits. Under Alexander's cost model with correlated data, an increase in the correlation coefficient leads to decreases in the sample size and the sampling interval, and to wider control limits. The sample size is not significantly affected by non-normality under either cost model. Increases in the skewness and kurtosis coefficients result in a longer sampling interval; the control limits are robust to both skewness and kurtosis under non-normal data. Only a slight effect is observed when non-normal, correlated data are considered.
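The Burr (type XII) distribution used throughout this line of work has CDF F(x) = 1 − (1 + x^c)^{−k}, which makes inverse-transform sampling straightforward. A sketch that draws Burr data and reports its sample skewness and kurtosis (the parameter values are illustrative, not the 12 categories tabulated in these theses):

```python
import numpy as np

def burr_xii_sample(c, k, size, seed=0):
    """Inverse-transform sampling from the Burr XII distribution
    F(x) = 1 - (1 + x**c)**(-k), x > 0, often used in this line of
    work to represent skewed non-normal process data."""
    u = np.random.default_rng(seed).uniform(size=size)
    return ((1 - u) ** (-1.0 / k) - 1.0) ** (1.0 / c)

def skew_kurt(x):
    """Sample skewness and (non-excess) kurtosis coefficients."""
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean(), (z ** 4).mean()

x = burr_xii_sample(c=3.0, k=6.0, size=100_000)
g1, g2 = skew_kurt(x)      # right-skewed for these parameter values
```

By varying (c, k), one can match a wide range of skewness-kurtosis pairs, which is exactly how these studies parameterize "degrees of non-normality".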
楊俊輝. "Economic model of X-chart under non-normality and measurement errors: a sensitivity study." Thesis, 1992. http://ndltd.ncl.edu.tw/handle/11288781492404754176.
Full textJunaidi, S. Si. "Meta-analysis adjusting for heterogeneity, dependence and non-normality: a Bayesian parametric approach." Thesis, 2015. http://hdl.handle.net/1959.13/1296543.
Full text
Independence and dependence between studies in meta-analysis are assumptions imposed on the structure of hierarchical Bayesian meta-analytic models. Whilst independence assumes that two or more studies have no correlation, dependence can occur when study reports use the same data or authors (Stevens & Taylor, 2009). In this thesis, the relaxation of the assumption of independence, and/or of a normal distribution for the true study effects, is investigated. A variety of statistical meta-analytic models were developed and extended. These include the DuMouchel (DuMouchel, 1990) model and the hierarchical Bayesian meta-regression (HBMR) (Jones et al., 2009) model, which assume independence within and between studies or between subgroups. Also investigated were the hierarchical Bayesian linear model (HBLM) (Stevens, 2005) and the hierarchical Bayesian delta-splitting (HBDS) (Stevens & Taylor, 2009) model, which allow for dependence between studies and sub-studies, introducing dependency at the sampling and hierarchical levels. Overall, the general Bayesian linear model (GBLM) theorems, the Gibbs sampler, the Metropolis-Hastings algorithm, and the Metropolis-within-Gibbs algorithm were shown to produce good estimates for the specific models.

The analytical forms of the joint posterior distributions of all parameters for the DuMouchel and HBMR models were derived using the GBLM theorems, with the models presented in matrix form so that the theorems could be applied directly. The GBLM theorems were shown to be a useful alternative meta-analytic approach. The Gibbs sampler was demonstrated to be an appropriate approach for approximating the parameters of the DuMouchel model, for which sensitivity analyses were conducted by imposing different prior distributions at the study level; in the HBMR model, different prior specifications were imposed at the subgroup level. An extended GBLM theorem was used to approximate the joint posterior distribution of the HBMR parameters, given that the analytical derivation of the posterior for the HBMR model can be computationally intractable owing to the integration of multiple functions. The DuMouchel and HBMR models were demonstrated on a data set on the incidence of Ewing's sarcoma (Honoki et al., 2007) and on a study relating exposure to certain chemicals to reproductive health (Jones et al., 2009), respectively. Consistency of results suggested that the GBLM theorem and the Gibbs sampler algorithm are good alternative approaches to parameter estimation for the DuMouchel and HBMR models. Parameter estimates were generally not sensitive to different prior distributions on the mean and variance for the DuMouchel model, and were close to the true values when different hyper-parameter values were specified for the HBMR model, indicating robust models.

The HBLM and HBDS models were introduced to allow for dependency at the sampling and hierarchical levels. The Gibbs sampler and the Metropolis-within-Gibbs algorithm were used to estimate the joint posterior distributions of all parameters for the HBLM and HBDS models, respectively. The Gibbs sampler successfully approximated the joint posterior distribution of the HBLM parameters; the analytical form of the HBLM for the l-dependence group was derived by calculating the conditional posterior distribution of each parameter, as these distributions were in standard form. The joint posterior distribution of the HBDS model, however, was derived using the Metropolis-within-Gibbs algorithm, chosen because the conditional posterior distributions of some parameters were in non-standard form. The formula for the joint posterior distribution was tested successfully on studies assessing the effects of native-language vocabulary aids on second-language reading.

Non-normal analogues of the independent and dependent DuMouchel model and HBLM were also developed, with the multivariate normal distribution for the true study effects replaced by a non-central multivariate t distribution. The joint posterior distributions of all parameters for the non-normal DuMouchel model and the non-normal HBLM were approximated using the Metropolis-Hastings algorithm, owing to its ability to handle non-standard conditional posterior distributions; estimation was successfully carried out in R. The Metropolis-Hastings algorithm was demonstrated to be a useful approach for estimating the joint posterior distribution of a hierarchical Bayesian model when a non-standard form is encountered. It is shown that conducting a meta-analysis that allows for dependency and/or a non-normal distribution for the true study effects in hierarchical Bayesian models can lead to good overall conclusions.
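As a much-simplified illustration of the Gibbs sampling used for these hierarchical models, here is a toy sampler for an independent-studies normal-normal random-effects meta-analysis (the priors, data, and model structure are illustrative and far simpler than the HBLM/HBDS models of the thesis):

```python
import numpy as np

def gibbs_meta(y, v, n_iter=4000, a=1.0, b=1.0, seed=0):
    """Gibbs sampler for y_i ~ N(theta_i, v_i), theta_i ~ N(mu, tau2),
    with a flat prior on mu and an InvGamma(a, b) prior on tau2.
    All conditionals are in standard form, so each step is a direct
    draw (the appeal of Gibbs sampling noted in the abstract)."""
    rng = np.random.default_rng(seed)
    k = len(y)
    mu, tau2 = np.mean(y), np.var(y) + 1e-6
    mus = np.empty(n_iter)
    for it in range(n_iter):
        prec = 1.0 / v + 1.0 / tau2           # per-study posterior precision
        theta = rng.normal((y / v + mu / tau2) / prec, np.sqrt(1.0 / prec))
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / k))
        # 1/Gamma(shape, scale=1/rate) draw is an inverse-gamma draw
        tau2 = 1.0 / rng.gamma(a + k / 2.0,
                               1.0 / (b + 0.5 * np.sum((theta - mu) ** 2)))
        mus[it] = mu
    return mus

y = np.array([0.3, 0.1, 0.4, 0.2, 0.35, 0.25])   # toy study effects
v = np.full(6, 0.01)                              # known within-study variances
mus = gibbs_meta(y, v)
est = mus[1000:].mean()                           # posterior mean of mu after burn-in
```

With equal within-study variances the posterior mean of the overall effect lands near the simple average of the study effects, a useful sanity check on the sampler.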
Liou, Jia-Hueng, and 劉家宏. "Non-Normality of the Joint Economic Design of X-bar and R Control Chart." Thesis, 1999. http://ndltd.ncl.edu.tw/handle/24656584001280333879.
Full text
National Yunlin University of Science and Technology
Institute of Industrial Engineering and Management
87
Since Duncan's pioneering work on the economic design of the X-bar control chart, much work has been devoted to the economic design of different control charts. Saniga (1977) was the first to propose a joint economically optimal design of X-bar and R control charts; in his research, the quality characteristic is assumed to be normally distributed, but in practice it often is not. In this research, the Burr distribution is used to represent a non-normally distributed quality characteristic, and Saniga's joint economic design model is used as the basis for developing the joint economic design of X-bar and R control charts. A genetic algorithm is employed to search for the optimal economic design parameters of the X-bar and R charts, and a computer program is developed to help practitioners find them. Two points must be considered before making use of this study:
1. The distribution of the quality characteristic must be one that can be approximated by a Burr distribution.
2. The nature of the non-normal distribution must be understood in advance, and its skewness and kurtosis coefficients obtained, before applying the results.
Twelve categories of non-normal distribution, each with 81 examples, are solved for the optimal design. This research found that if the normal model is applied when the quality characteristic is actually non-normally distributed, the false alarm rate and the expected cost per unit of output of the normal model exceed those obtained here.
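As a hedged illustration of the genetic-algorithm search used in this kind of economic design, here is a bare-bones real-coded GA minimizing a stand-in quadratic cost over design parameters (n, h, k); the thesis's actual economic cost model and GA settings are not reproduced:

```python
import numpy as np

def genetic_search(cost, bounds, pop=40, gens=60, seed=0):
    """Bare-bones real-coded genetic algorithm: tournament selection,
    uniform crossover, Gaussian mutation, box constraints."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    P = rng.uniform(lo, hi, (pop, len(bounds)))
    for _ in range(gens):
        f = np.array([cost(p) for p in P])
        # tournament selection: keep the better of two random parents
        i, j = rng.integers(0, pop, (2, pop))
        parents = P[np.where(f[i] < f[j], i, j)]
        # uniform crossover between consecutive parents
        mask = rng.random(P.shape) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))
        # Gaussian mutation, clipped back into the box
        children += rng.normal(0, 0.02 * (hi - lo), P.shape)
        P = np.clip(children, lo, hi)
    f = np.array([cost(p) for p in P])
    return P[f.argmin()], f.min()

# stand-in smooth cost with optimum at n = 5, h = 1, k = 3
cost = lambda p: (p[0] - 5) ** 2 + (p[1] - 1) ** 2 + (p[2] - 3) ** 2
best, fbest = genetic_search(cost, [(1, 20), (0.1, 8), (0.1, 6)])
```

In the economic design setting, `cost` would be replaced by the expected cost per unit of output evaluated at a candidate (n, h, k) design.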
Lin, Kung-Hong, and 林昆宏. "Non-Normality of the Joint Economic Design of X-bar and S Control Chart." Thesis, 2003. http://ndltd.ncl.edu.tw/handle/56180967728341392283.
Full text
National Yunlin University of Science and Technology
Institute of Industrial Engineering and Management (Master's Program)
91
Traditionally, the quality characteristic is assumed to be normally distributed when control charts are applied for statistical process control. If the observations are not normally distributed, traditional control chart designs may reduce the chart's ability to detect assignable causes. Based on the Burr distribution, the Hooke and Jeeves optimal search rule, and computer simulation, this research develops a joint economic design model for X-bar and S control charts under non-normality. The thesis discusses X-bar and S charts that simultaneously control the process mean and variance using Knappenberger and Grandage's (1969) cost model, and it also proposes an economic design that maximizes the profit per unit of output. The purposes of this research are:
1. To apply non-normal distributions to the joint economic design of X-bar and S control charts.
2. To develop control limits for the joint economic design of X-bar and S control charts under non-normality.
3. To obtain optimal solutions for different (c, k) values and perform sensitivity analysis.
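The Hooke and Jeeves search rule mentioned above alternates exploratory coordinate moves with pattern moves in the direction of improvement. A minimal sketch on a stand-in quadratic cost (not the thesis's economic model):

```python
import numpy as np

def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
    """Minimal Hooke and Jeeves pattern search: exploratory moves along
    each coordinate, then a pattern move through the improved point;
    the step is halved whenever no exploratory move helps."""
    def explore(base, fbase, s):
        x, fx = base.copy(), fbase
        for i in range(len(x)):
            for d in (s, -s):
                trial = x.copy()
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx = trial, ft
                    break
        return x, fx

    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        if step < tol:
            break
        nx, nfx = explore(x, fx, step)
        if nfx < fx:
            px = nx + (nx - x)        # pattern move: keep going the same way
            pfx = f(px)
            x, fx = (px, pfx) if pfx < nfx else (nx, nfx)
        else:
            step *= shrink            # no improvement: refine the mesh
    return x, fx

best, fbest = hooke_jeeves(lambda p: (p[0] - 2) ** 2 + (p[1] + 1) ** 2,
                           [0.0, 0.0])
```

Being derivative-free, the method suits economic design objectives that are evaluated by simulation rather than given in closed form.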