Academic literature on the topic 'Statistical estimation problem'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Statistical estimation problem.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Statistical estimation problem"

1

Vogel, Annika, and Richard Ménard. "How far can the statistical error estimation problem be closed by collocated data?" Nonlinear Processes in Geophysics 30, no. 3 (2023): 375–98. http://dx.doi.org/10.5194/npg-30-375-2023.

Full text
Abstract:
Accurate specification of the error statistics required for data assimilation remains an ongoing challenge, partly because their estimation is an underdetermined problem that requires statistical assumptions. Even with the common assumption that background and observation errors are uncorrelated, the problem remains underdetermined. One natural question that could arise is as follows: can the increasing amount of overlapping observations or other datasets help to reduce the total number of statistical assumptions, or do they introduce more statistical unknowns? In order to answer this question, this paper provides a conceptual view on the statistical error estimation problem for multiple collocated datasets, including a generalized mathematical formulation, an illustrative demonstration with synthetic data, and guidelines for setting up and solving the problem. It is demonstrated that the required number of statistical assumptions increases linearly with the number of datasets. However, the number of error statistics that can be estimated increases quadratically, allowing for an estimation of an increasing number of error cross-statistics between datasets for more than three datasets. The presented generalized estimation of full error covariance and cross-covariance matrices between datasets does not necessarily accumulate the uncertainties of assumptions among error estimations of multiple datasets.
APA, Harvard, Vancouver, ISO, and other styles
2

Pisarenko, V. F., A. A. Lyubushin, V. B. Lysenko, and T. V. Golubeva. "Statistical estimation of seismic hazard parameters: Maximum possible magnitude and related parameters." Bulletin of the Seismological Society of America 86, no. 3 (1996): 691–700. http://dx.doi.org/10.1785/bssa0860030691.

Full text
Abstract:
The problem of statistical estimation of earthquake hazard parameters is considered. The emphasis is on estimation of the maximum regional magnitude, Mmax, and the maximum magnitude, Mmax(T), in a future time interval T and quantiles of its distribution. Two estimators are suggested: an unbiased estimator with the lowest possible variance and a Bayesian estimator. As an illustration, these methods are applied for the estimation of Mmax and related parameters in California and Italy.
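As a hedged illustration of why Mmax estimation needs care (not code from the cited paper): the naive estimator, the largest observed magnitude, is always biased low, which is what motivates the unbiased and Bayesian estimators above. The truncated-exponential magnitude law and all parameter values below are arbitrary assumptions for the simulation.

```python
import math
import random

def simulate_naive_mmax_bias(m_max=8.0, beta=2.0, m_min=4.0,
                             n_events=200, n_trials=500, seed=0):
    """Average bias of the naive estimator Mmax_hat = max(observed magnitudes)
    when magnitudes follow an exponential law truncated at m_max."""
    rng = random.Random(seed)
    scale = 1 - math.exp(-beta * (m_max - m_min))
    total = 0.0
    for _ in range(n_trials):
        # Inverse-CDF sampling from the doubly truncated exponential law.
        sample_max = max(
            m_min - math.log(1 - rng.random() * scale) / beta
            for _ in range(n_events)
        )
        total += sample_max - m_max   # always negative: the max never reaches m_max
    return total / n_trials

bias = simulate_naive_mmax_bias()     # clearly below zero
```

The systematic shortfall of the sample maximum is exactly the gap that unbiased and Bayesian corrections aim to close.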
3

Yang, Da. "Interval Estimation and Hypothesis Testing." Applied Mechanics and Materials 543-547 (March 2014): 1717–20. http://dx.doi.org/10.4028/www.scientific.net/amm.543-547.1717.

Full text
Abstract:
Interval estimation and hypothesis testing are two important problems of statistical inference, a branch of mathematical statistics with extensive applications. As statistical inference methods, they are ever more widely used in economic management, finance and insurance, scientific research, engineering technology, and decision science. Establishing the mutual influence and communication between interval estimation and hypothesis testing, and using the theory of one to explain problems in the other, is an important step toward improving the theory of statistical inference. Therefore, building on the internal relations between interval estimation and hypothesis testing, this paper investigates them in depth, explains hypothesis testing problems from the viewpoint of interval estimation, and discusses the differences and connections between the two.
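The duality discussed in this abstract can be demonstrated concretely. The sketch below (illustrative, not from the cited paper) uses a known-variance z-test: rejecting H0: mu = mu0 at level alpha is equivalent to mu0 falling outside the (1 - alpha) confidence interval.

```python
import math

Z_CRIT = 1.959963984540054          # Phi^{-1}(0.975), i.e. alpha = 0.05 two-sided

def z_confidence_interval(xs, sigma):
    """Known-sigma 95% confidence interval for the mean."""
    n = len(xs)
    xbar = sum(xs) / n
    half = Z_CRIT * sigma / math.sqrt(n)
    return (xbar - half, xbar + half)

def z_test_rejects(xs, mu0, sigma):
    """Two-sided z-test of H0: mu = mu0 at alpha = 0.05."""
    n = len(xs)
    xbar = sum(xs) / n
    z = (xbar - mu0) / (sigma / math.sqrt(n))
    return abs(z) > Z_CRIT

xs = [4.8, 5.1, 5.3, 4.9, 5.4, 5.0, 5.2, 4.7]
ci = z_confidence_interval(xs, sigma=0.25)
# The test rejects H0: mu = mu0 exactly when mu0 lies outside the interval.
for mu0 in [4.8, 4.9, 5.0, 5.1, 5.2, 5.3]:
    inside = ci[0] <= mu0 <= ci[1]
    assert z_test_rejects(xs, mu0, sigma=0.25) == (not inside)
```

The same equivalence underlies t-intervals and t-tests; the known-sigma case just keeps the critical value explicit.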
4

Adepoju, Akeem Ajibola, Akanji Olalekan Bello, Alhaji Modu Isa, Akinrefon Adesupo, and Jamiu S. Olumoh. "STATISTICAL INFERENCE ON SINE-EXPONENTIAL DISTRIBUTION PARAMETER." Journal of Computational Innovation and Analytics (JCIA) 3, no. 2 (2024): 129–45. http://dx.doi.org/10.32890/jcia2024.3.2.6.

Full text
Abstract:
The Sine-Exponential (Sine-E) distribution is a probability distribution that combines the periodic behavior of the sine function with the decay characteristic of the exponential function. This study addresses the problem of identifying the most accurate and reliable estimation method for the parameter of the Sine-E distribution. The objective is to evaluate various parameter estimation techniques, including Maximum Likelihood Estimation (MLE), Least Squares Estimation (LSE), Weighted Least Squares Estimation (WLSE), Maximum Product of Spacing Estimation (MPSE), Cramer-von-Mises Estimation (CVME), and Anderson-Darling Estimation (ADE), using Mean Square Error (MSE) as the criterion for determining the technique with the minimum error. The study’s findings reveal that as sample size increases, the parameter estimates for all techniques converge to the true parameter value, with decreases in bias, MSE, and mean relative estimates. Among the techniques evaluated, the MPSE method consistently provides estimates closest to the true parameter value and exhibits the least bias and lowest MSE across small, moderate, and large sample sizes, making it the best estimator for the Sine-E distribution.
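A hedged sketch of the MPSE technique the study favors, applied to an exponential distribution rather than the Sine-E distribution, with a simple grid search standing in for a numerical optimizer:

```python
import math
import random

def mpse_exponential(xs, grid):
    """Maximum Product of Spacings estimate of the rate of an Exp(rate) sample:
    choose the rate maximizing sum(log(F(x_(i)) - F(x_(i-1)))) over the ordered
    data, with F(x_(0)) = 0 and F(x_(n+1)) = 1."""
    xs = sorted(xs)
    best_rate, best_obj = None, -math.inf
    for rate in grid:
        cdf = [0.0] + [1 - math.exp(-rate * x) for x in xs] + [1.0]
        # Clip spacings away from zero so the log is always defined.
        spacings = [max(cdf[i + 1] - cdf[i], 1e-300) for i in range(len(cdf) - 1)]
        obj = sum(math.log(d) for d in spacings)
        if obj > best_obj:
            best_obj, best_rate = obj, rate
    return best_rate

rng = random.Random(1)
true_rate = 2.0
xs = [rng.expovariate(true_rate) for _ in range(500)]
grid = [0.5 + 0.01 * k for k in range(400)]   # candidate rates 0.50 .. 4.49
est = mpse_exponential(xs, grid)              # close to true_rate for n = 500
```

For large samples the MPS objective behaves much like the likelihood, which is consistent with the abstract's finding that all methods converge as the sample size grows.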
5

Rothman, Daniel H. "Nonlinear inversion, statistical mechanics, and residual statics estimation." GEOPHYSICS 50, no. 12 (1985): 2784–96. http://dx.doi.org/10.1190/1.1441899.

Full text
Abstract:
Nonlinear inverse problems are usually solved with linearized techniques that depend strongly on the accuracy of initial estimates of the model parameters. With linearization, objective functions can be minimized efficiently, but the risk of local rather than global optimization can be severe. I address the problem confronted in nonlinear inversion when no good initial guess of the model parameters can be made. The fully nonlinear approach presented is rooted in statistical mechanics. Although a large nonlinear problem might appear computationally intractable without linearization, reformulation of the same problem into smaller, interdependent parts can lead to tractable computation while preserving nonlinearities. I formulate inversion as a problem of Bayesian estimation, in which the prior probability distribution is the Gibbs distribution of statistical mechanics. Solutions are then obtained by maximizing the posterior probability of the model parameters. Optimization is performed with a Monte Carlo technique that was originally introduced to simulate the statistical mechanics of systems in equilibrium. The technique is applied to residual statics estimation when statics are unusually large and data are contaminated by noise. Poorly picked correlations (“cycle skips” or “leg jumps”) appear as local minima of the objective function, but global optimization is successfully performed. Further applications to deconvolution and velocity estimation are proposed.
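The Metropolis-style optimization the paper applies can be sketched generically. The objective, cooling schedule, and proposal width below are illustrative assumptions, not the paper's residual-statics setup:

```python
import math
import random

def anneal(f, x0, steps=20000, t0=2.0, seed=0):
    """Metropolis-style simulated annealing: always accept downhill moves,
    accept uphill moves with probability exp(-delta / T), so the search can
    escape local minima while the temperature T is slowly lowered."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for k in range(steps):
        temp = t0 / (1 + 0.01 * k)            # slow cooling schedule
        cand = x + rng.gauss(0.0, 0.5)        # local Gaussian proposal
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# Multimodal objective with many local minima and a global minimum at x = 0,
# loosely analogous to the cycle-skip minima described in the abstract.
f = lambda x: x * x + 10 * (1 - math.cos(2 * math.pi * x))
x_star, f_star = anneal(f, x0=4.3)            # starts in a wrong basin
```

A purely downhill search started at x0 = 4.3 would stall in a local well; the stochastic acceptance rule lets the chain hop between wells toward the global minimum.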
6

Yamane, Ikko, Hiroaki Sasaki, and Masashi Sugiyama. "Regularized Multitask Learning for Multidimensional Log-Density Gradient Estimation." Neural Computation 28, no. 7 (2016): 1388–410. http://dx.doi.org/10.1162/neco_a_00844.

Full text
Abstract:
Log-density gradient estimation is a fundamental statistical problem and possesses various practical applications such as clustering and measuring nongaussianity. A naive two-step approach of first estimating the density and then taking its log gradient is unreliable because an accurate density estimate does not necessarily lead to an accurate log-density gradient estimate. To cope with this problem, a method to directly estimate the log-density gradient without density estimation has been explored and demonstrated to work much better than the two-step method. The objective of this letter is to improve the performance of this direct method in multidimensional cases. Our idea is to regard the problem of log-density gradient estimation in each dimension as a task and apply regularized multitask learning to the direct log-density gradient estimator. We experimentally demonstrate the usefulness of the proposed multitask method in log-density gradient estimation and mode-seeking clustering.
7

Haj Ahmad, Hanan, and Ehab M. Almetwally. "On Statistical Inference of Generalized Pareto Distribution with Jointly Progressive Censored Samples with Binomial Removal." Mathematical Problems in Engineering 2023 (April 21, 2023): 1–14. http://dx.doi.org/10.1155/2023/1821347.

Full text
Abstract:
A jointly censored sample is a very useful sampling technique for conducting comparative life tests of products; its efficiency lies in permitting the selection of two samples from two manufacturing lines at the same time and conducting a single life-testing experiment. This article presents estimation of the joint generalized Pareto distribution parameters using a Type-II progressive censoring scheme carried out with binomial removal. The generalized Pareto distribution has many applications in different fields. We outline the problem of parameter estimation using the frequentist maximum likelihood and Bayesian estimation methods. Furthermore, different interval estimation methods for estimating the four parameters were used: the asymptotic property of the maximum likelihood estimator, credible confidence intervals, and bootstrap confidence intervals. Detailed numerical simulations have been considered to compare the performance of the proposed estimates. In addition, the applicability of the joint generalized Pareto censored model has been demonstrated on a real data example.
8

Ao, Ziqiao, and Jinglai Li. "Entropy Estimation via Normalizing Flow." Proceedings of the AAAI Conference on Artificial Intelligence 36, no. 9 (2022): 9990–98. http://dx.doi.org/10.1609/aaai.v36i9.21237.

Full text
Abstract:
Entropy estimation is an important problem in information theory and statistical science. Many popular entropy estimators suffer from fast-growing estimation bias with respect to dimensionality, rendering them unsuitable for high-dimensional problems. In this work we propose a transform-based method for high-dimensional entropy estimation, which consists of the following two main ingredients. First, by modifying the k-NN based entropy estimator, we propose a new estimator which enjoys small estimation bias for samples that are close to a uniform distribution. Second, we design a normalizing-flow based mapping that pushes samples toward a uniform distribution, and the relation between the entropy of the original samples and the transformed ones is also derived. As a result, the entropy of a given set of samples is estimated by first transforming them toward a uniform distribution and then applying the proposed estimator to the transformed samples. Numerical experiments demonstrate the effectiveness of the method for high-dimensional entropy estimation problems.
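A minimal 1D version of the k-NN entropy estimator this paper builds on (the Kozachenko-Leonenko estimator with k = 1; the sample size, seed, and uniform test distribution are illustrative assumptions):

```python
import math
import random

def knn_entropy_1d(xs):
    """Kozachenko-Leonenko nearest-neighbor entropy estimator in 1D with k = 1:
    H_hat = psi(N) - psi(1) + ln(2) + (1/N) * sum(ln eps_i),
    where eps_i is the distance from x_i to its nearest neighbor."""
    n = len(xs)
    xs = sorted(xs)
    eps = []
    for i, x in enumerate(xs):
        d_left = x - xs[i - 1] if i > 0 else math.inf
        d_right = xs[i + 1] - x if i < n - 1 else math.inf
        eps.append(max(min(d_left, d_right), 1e-12))   # guard against ties
    euler_gamma = 0.5772156649015329
    psi_n = math.log(n) - 1 / (2 * n)   # digamma(n), large-n approximation
    psi_1 = -euler_gamma                # digamma(1)
    return psi_n - psi_1 + math.log(2) + sum(math.log(e) for e in eps) / n

rng = random.Random(0)
xs = [rng.random() for _ in range(4000)]
h = knn_entropy_1d(xs)   # true differential entropy of U(0,1) is 0
```

For a sample already close to uniform the estimate is near the true value 0; the paper's flow-based transform is what makes arbitrary samples "close to uniform" before such an estimator is applied.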
9

Sasaki, Hiroaki, Yung-Kyun Noh, Gang Niu, and Masashi Sugiyama. "Direct Density Derivative Estimation." Neural Computation 28, no. 6 (2016): 1101–40. http://dx.doi.org/10.1162/neco_a_00835.

Full text
Abstract:
Estimating the derivatives of probability density functions is an essential step in statistical data analysis. A naive approach to estimate the derivatives is to first perform density estimation and then compute its derivatives. However, this approach can be unreliable because a good density estimator does not necessarily mean a good density derivative estimator. To cope with this problem, in this letter, we propose a novel method that directly estimates density derivatives without going through density estimation. The proposed method provides computationally efficient estimation for the derivatives of any order on multidimensional data with a hyperparameter tuning method and achieves the optimal parametric convergence rate. We further discuss an extension of the proposed method by applying regularized multitask learning and a general framework for density derivative estimation based on Bregman divergences. Applications of the proposed method to nonparametric Kullback-Leibler divergence approximation and bandwidth matrix selection in kernel density estimation are also explored.
10

Sun, Qingfeng, Cuihong Chen, Hui Wang, Ningning Xu, Chao Liu, and Jixi Gao. "A Method for Assessing Background Concentrations near Sources of Strong CO2 Emissions." Atmosphere 14, no. 2 (2023): 200. http://dx.doi.org/10.3390/atmos14020200.

Full text
Abstract:
In the quantification model of the emission intensity of emission sources, estimating the background concentration of greenhouse gases near an emission source is an important problem. The traditional method of estimating the background concentration of greenhouse gases from statistical information often results in a certain deviation. To solve this problem, we propose an adaptive estimation method for CO2 background concentrations near emission sources, which takes full advantage of robust local regression and a Gaussian mixture model to achieve accurate estimation of greenhouse gas background concentrations. Experiments show that when the measurement error is 0.2 ppm, the background concentration estimation error is only 0.08 mg/m3, and even when the measurement error is 1.2 ppm, the background concentration estimation error is less than 0.4 mg/m3. The CO2 concentration measurement data consistently show a good background concentration assessment effect, and the accuracy of top-down carbon emission quantification based on actual measurements should be effectively improved in the future.
More sources

Dissertations / Theses on the topic "Statistical estimation problem"

1

Chen, Jinbo. "Semiparametric efficient and inefficient estimation for the auxiliary outcome problem with the conditional mean model /." Thesis, Connect to this title online; UW restricted, 2002. http://hdl.handle.net/1773/9531.

Full text
2

Källberg, David. "Nonparametric Statistical Inference for Entropy-type Functionals." Doctoral thesis, Umeå universitet, Institutionen för matematik och matematisk statistik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-79976.

Full text
Abstract:
In this thesis, we study statistical inference for entropy, divergence, and related functionals of one or two probability distributions. Asymptotic properties of particular nonparametric estimators of such functionals are investigated. We consider estimation from both independent and dependent observations. The thesis consists of an introductory survey of the subject and some related theory and four papers (A-D). In Paper A, we consider a general class of entropy-type functionals which includes, for example, integer order Rényi entropy and certain Bregman divergences. We propose U-statistic estimators of these functionals based on the coincident or epsilon-close vector observations in the corresponding independent and identically distributed samples. We prove some asymptotic properties of the estimators such as consistency and asymptotic normality. Applications of the obtained results related to entropy maximizing distributions, stochastic databases, and image matching are discussed. In Paper B, we provide some important generalizations of the results for continuous distributions in Paper A. The consistency of the estimators is obtained under weaker density assumptions. Moreover, we introduce a class of functionals of quadratic order, including both entropy and divergence, and prove normal limit results for the corresponding estimators which are valid even for densities of low smoothness. The asymptotic properties of a divergence-based two-sample test are also derived. In Paper C, we consider estimation of the quadratic Rényi entropy and some related functionals for the marginal distribution of a stationary m-dependent sequence. We investigate asymptotic properties of the U-statistic estimators for these functionals introduced in Papers A and B when they are based on a sample from such a sequence. We prove consistency, asymptotic normality, and Poisson convergence under mild assumptions for the stationary m-dependent sequence. 
Applications of the results to time-series databases and entropy-based testing for dependent samples are discussed. In Paper D, we further develop the approach for estimation of quadratic functionals with m-dependent observations introduced in Paper C. We consider quadratic functionals for one or two distributions. The consistency and rate of convergence of the corresponding U-statistic estimators are obtained under weak conditions on the stationary m-dependent sequences. Additionally, we propose estimators based on incomplete U-statistics and show their consistency properties under more general assumptions.
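A hedged 1D sketch of the epsilon-close-pairs idea behind these U-statistic estimators, applied to quadratic Rényi entropy (the window width and uniform test distribution are illustrative choices, not the thesis's construction):

```python
import math
import random

def renyi2_entropy(xs, eps=0.02):
    """U-statistic estimator of quadratic Renyi entropy H2 = -log(integral f^2):
    integral f^2 is estimated by the fraction of epsilon-close sample pairs,
    divided by the window length 2 * eps."""
    n = len(xs)
    xs = sorted(xs)
    close_pairs = 0
    j = 0
    for i in range(n):                 # two-pointer count of pairs |xi - xj| <= eps
        while xs[i] - xs[j] > eps:
            j += 1
        close_pairs += i - j
    q_hat = close_pairs / (n * (n - 1) / 2) / (2 * eps)
    return -math.log(q_hat)

rng = random.Random(3)
xs = [rng.random() for _ in range(2000)]
h2 = renyi2_entropy(xs)   # true quadratic Renyi entropy of U(0,1) is 0
```

Sorting plus the two-pointer scan counts the epsilon-close pairs in O(n log n) instead of the naive O(n^2) double loop over all pairs.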
3

Mukherjee, Rajarshi. "Statistical Inference for High Dimensional Problems." Thesis, Harvard University, 2014. http://dissertations.umi.com/gsas.harvard:11516.

Full text
Abstract:
In this dissertation, we study minimax hypothesis testing in high-dimensional regression against sparse alternatives and minimax estimation of the average treatment effect in a semiparametric regression with a possibly large number of covariates.
4

Herrick, David Richard Mark. "Wavelet methods for curve and surface estimation." Thesis, University of Bristol, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.310601.

Full text
5

Zhang, Bingwen. "Change-points Estimation in Statistical Inference and Machine Learning Problems." Digital WPI, 2017. https://digitalcommons.wpi.edu/etd-dissertations/344.

Full text
Abstract:
Statistical inference plays an increasingly important role in science, finance, and industry. Despite the extensive research and wide application of statistical inference, most efforts focus on uniform models. This thesis instead considers statistical inference in models with abrupt changes. The task is to estimate the change-points where the underlying model changes. We first study low-dimensional linear regression problems in which the underlying model undergoes multiple changes. Our goal is to estimate the number and locations of change-points that segment the available data into different regions, and further to produce sparse and interpretable models for each region. To address the challenges of existing approaches and to produce interpretable models, we propose a sparse group Lasso (SGL) based approach for linear regression problems with change-points. We then extend our method to high-dimensional nonhomogeneous linear regression models. Under certain assumptions and with a properly chosen regularization parameter, we show several desirable properties of the method. We further extend our studies to generalized linear models (GLM) and prove similar results. In practice, change-point inference usually involves high-dimensional data, so it is natural to tackle it with distributed learning over feature-partitioned data, in which each machine in the cluster stores a part of the features. One bottleneck for distributed learning is communication. To address this implementation concern, we design a communication-efficient algorithm for feature-partitioned data sets that speeds up not only change-point inference but also other classes of machine learning problems, including Lasso, support vector machines (SVM), and logistic regression.
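As an illustrative baseline for the change-point problem the thesis studies (a brute-force least-squares fit for a single mean shift, far simpler than the SGL approach described above):

```python
import random

def best_single_changepoint(ys):
    """Least-squares single change-point: pick the split minimizing the total
    squared error of a piecewise-constant (two-mean) fit."""
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((y - m) ** 2 for y in seg)
    n = len(ys)
    best_k, best_cost = None, float("inf")
    for k in range(1, n):              # k = size of the left segment
        cost = sse(ys[:k]) + sse(ys[k:])
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

rng = random.Random(7)
ys = ([rng.gauss(0.0, 0.3) for _ in range(60)] +
      [rng.gauss(2.0, 0.3) for _ in range(40)])
k_hat = best_single_changepoint(ys)    # true change-point at index 60
```

Methods like the thesis's SGL formulation generalize this idea to multiple change-points, regression (rather than mean-shift) models, and high dimensions, where exhaustive search is no longer feasible.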
6

Savino, Mary Edith. "Statistical learning methods for nonlinear geochemical problems." Electronic Thesis or Diss., université Paris-Saclay, 2024. http://www.theses.fr/2024UPASM032.

Full text
Abstract:
In this thesis, we propose two function estimation methods and a variable selection method in a multivariate nonparametric regression model, as part of numerical simulations of geochemical systems for a deep geological disposal facility of highly radioactive waste. More specifically, in Chapter 2, we present an active learning procedure using Gaussian processes to approximate unknown functions having several input variables. This method allows, at each iteration, the computation of the global uncertainty of the function estimate and thus a judicious choice of the points at which the function should next be evaluated. Consequently, the number of observations needed to obtain a satisfactory estimate of the underlying function is considerably reduced, limiting calls to geochemical reaction equation solvers and reducing calculation times. In Chapter 3, we propose a second, non-sequential function estimation method, called GLOBER, which approximates the function to estimate by a linear combination of B-splines. In this approach, since the knots of the B-splines can be seen as changes in the derivatives of the function to estimate, they are selected using the generalized lasso. In Chapter 4, we introduce a novel variable selection method for multivariate nonparametric regression, ABSORBER, to identify the variables the unknown function really depends on and thereby reduce the complexity of the geochemical systems studied. In this approach, we assume that the function can be approximated by a linear combination of B-splines and their pairwise interaction terms. The coefficients of each term of the linear combination are estimated using the standard least squares criterion penalized by the l2-norms of the partial derivatives with respect to each variable. The proposed approaches were evaluated and validated through numerical experiments and were all applied to geochemical systems of varying complexity. Comparisons with state-of-the-art methods showed that our methods outperformed the others. In Chapter 5, the function estimation and variable selection methods were applied in the context of a European project, EURAD, and compared to methods devised by other scientific teams involved in the project. This application highlighted the performance of our methods, particularly when only the relevant variables selected with ABSORBER were considered. The proposed methods have been implemented in R packages, glober and absorber, which are available on CRAN (the Comprehensive R Archive Network).
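The active-learning loop described for Chapter 2 (evaluate the function where the model is most uncertain) can be sketched with a minimal Gaussian-process variance criterion. The kernel, length scale, candidate grid, and target function below are illustrative assumptions, not the thesis's implementation:

```python
import numpy as np

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel matrix between 1D point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_active_learning(f, candidates, n_init=3, n_queries=10, noise=1e-6, seed=0):
    """Active learning sketch: at each step, evaluate f at the candidate point
    where the GP posterior variance is largest."""
    rng = np.random.default_rng(seed)
    X = list(rng.choice(candidates, size=n_init, replace=False))
    y = [f(x) for x in X]
    for _ in range(n_queries):
        Xa = np.array(X)
        K = rbf(Xa, Xa) + noise * np.eye(len(X))
        Kinv = np.linalg.inv(K)
        Ks = rbf(np.asarray(candidates), Xa)                 # cross-covariances
        var = 1.0 - np.einsum("ij,jk,ik->i", Ks, Kinv, Ks)   # posterior variance
        x_next = candidates[int(np.argmax(var))]
        X.append(x_next)
        y.append(f(x_next))
    return np.array(X), np.array(y)

f = lambda x: np.sin(3 * x)
candidates = np.linspace(0.0, 2.0, 101)
X, y = gp_active_learning(f, candidates)
```

Because the posterior variance collapses near points already observed, successive queries spread out to cover the domain, which is the mechanism that cuts down the number of expensive solver calls.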
7

Depersin, Jules. "Statistical and Computational Complexities of Robust and High-Dimensional Estimation Problems." Electronic Thesis or Diss., Institut polytechnique de Paris, 2021. http://www.theses.fr/2021IPPAG009.

Full text
Abstract:
Statistical learning theory aims at providing a better understanding of the statistical properties of learning algorithms. These properties are often derived assuming the underlying data are gathered by sampling independent and identically distributed Gaussian (or subgaussian) random variables. They can thus be drastically affected by the presence of gross errors (also called "outliers") in the data, and by data being heavy-tailed. We are interested in procedures that have good properties even when part of the data is corrupted and heavy-tailed, procedures that we call robust, which we often obtain in this thesis by using the Median-of-Means heuristic. We are especially interested in procedures that are robust in high-dimensional set-ups, and we study (i) how dimensionality affects the statistical properties of robust procedures, and (ii) how dimensionality affects the computational complexity of the associated algorithms. In the study of the statistical properties (i), we find that for a large range of problems, the statistical complexity of a problem and its "robustness" can in a sense be "decoupled", leading to bounds where the dimension-dependent term is added to the term that depends on the corruption, rather than multiplied by it. We propose ways of measuring the statistical complexities of some problems in this corrupted framework, using for instance VC-dimension. We also provide lower bounds for some of those problems. In the study of the computational complexity of the associated algorithms (ii), we show that in two special cases, namely robust mean estimation with respect to the Euclidean norm and robust regression, one can relax the associated optimization problems, which become exponentially hard with the dimension, to obtain a tractable algorithm that behaves polynomially in the dimension.
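The Median-of-Means heuristic central to this thesis is simple to sketch; the block count and the contamination below are illustrative:

```python
import random
import statistics

def median_of_means(xs, n_blocks=10):
    """Median-of-Means: split the sample into blocks, average each block, and
    return the median of the block means. A few gross outliers corrupt at most
    a few blocks, so the median of the block means stays near the truth."""
    k = len(xs) // n_blocks
    block_means = [sum(xs[i * k:(i + 1) * k]) / k for i in range(n_blocks)]
    return statistics.median(block_means)

rng = random.Random(5)
xs = [rng.gauss(0.0, 1.0) for _ in range(1000)]
xs[:3] = [1e6, -1e6, 1e6]            # three gross outliers
plain_mean = sum(xs) / len(xs)       # dragged far from 0 by the outliers
mom = median_of_means(xs)            # stays close to the true mean 0
```

The empirical mean is ruined by a single outlier, while the Median-of-Means estimate tolerates a constant fraction of corrupted blocks, which is the behavior the thesis quantifies in high dimensions.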
8

Newman, Mark A. "Some problems in the estimation of testing of percentiles." Thesis, University of Newcastle Upon Tyne, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.239260.

Full text
9

Fertis, Apostolos. "A robust optimization approach to statistical estimation problems by Apostolos G. Fertis." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/53270.

Full text
Abstract:
Thesis (Ph.D.), Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 2009. Cataloged from the PDF version of the thesis. Includes bibliographical references (p. 87-91).
There have long been intuitive connections between robustness and regularization in statistical estimation, for example, in lasso and support vector machines. In the first part of the thesis, we formalize these connections using robust optimization. Specifically: (a) We show that in classical regression, regularized estimators like lasso can be derived by applying robust optimization to the classical least squares problem. We discover the explicit connection between the size and structure of the uncertainty set used in the robust estimator and the coefficient and kind of norm used in regularization. We compare the out-of-sample performance of the nominal and robust estimators on computer-generated and real data. (b) We prove that the support vector machines estimator is also a robust estimator of some nominal classification estimator (this last fact was also observed independently and simultaneously by Xu, Caramanis, and Mannor [52]). We generalize the support vector machines estimator by considering several sizes and structures for the uncertainty sets, and prove that the respective max-min optimization problems can be expressed as regularization problems. In the second part of the thesis, we turn our attention to constructing robust maximum likelihood estimators. Specifically: (a) We define robust estimators for the logistic regression model, taking into consideration uncertainty in the independent variables, in the response variable, and in both. We consider several structures for the uncertainty sets and prove that, in all cases, they lead to convex optimization problems. We provide efficient algorithms to compute the estimates in all cases. We report on the out-of-sample performance of the robust as well as the nominal estimators on both computer-generated and real data sets, and conclude that the robust estimators achieve a higher success rate. (b) We develop a robust maximum likelihood estimator for the multivariate normal distribution by considering uncertainty sets for the data used to produce it. We develop an efficient first-order gradient descent method to compute the estimate and compare the efficiency of the robust estimate to the respective nominal one on computer-generated data.
APA, Harvard, Vancouver, ISO, and other styles
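The robustness-regularization connection described in this abstract can be illustrated numerically. The sketch below uses invented data and a Frobenius-norm uncertainty set of radius rho, and checks the standard identity that the worst-case residual over bounded perturbations of the design matrix equals the nominal residual plus a ridge-type penalty rho * ||beta||.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 4))   # invented design matrix
y = rng.normal(size=30)        # invented response
beta = rng.normal(size=4)      # an arbitrary candidate estimate
rho = 0.5                      # radius of the uncertainty set

r = y - X @ beta
# Closed form for the worst case over ||Delta||_F <= rho:
#   max ||y - (X + Delta) beta||_2 = ||r||_2 + rho * ||beta||_2
worst = np.linalg.norm(r) + rho * np.linalg.norm(beta)

# The maximizing perturbation pushes -Delta @ beta along the residual direction
u = r / np.linalg.norm(r)
Delta = -rho * np.outer(u, beta) / np.linalg.norm(beta)
attained = np.linalg.norm(y - (X + Delta) @ beta)
assert np.isclose(np.linalg.norm(Delta), rho)
assert np.isclose(attained, worst)

# Random feasible perturbations never exceed the closed-form bound
for _ in range(200):
    D = rng.normal(size=X.shape)
    D *= rho / np.linalg.norm(D)
    assert np.linalg.norm(y - (X + D) @ beta) <= worst + 1e-9
print("worst-case residual = nominal residual + rho * ||beta||")
```

Minimizing this worst-case residual over beta is exactly a regularized least-squares problem, which is the sense in which robustness induces regularization.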
10

Rau, Christian. "Curve estimation and signal discrimination in spatial problems /." View thesis entry in Australian Digital Theses Program, 2003. http://thesis.anu.edu.au/public/adt-ANU20031215.163519/index.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Statistical estimation problem"

1

Paliy, Irina. Probability theory and mathematical statistics. INFRA-M Academic Publishing LLC., 2021. http://dx.doi.org/10.12737/1065828.

Full text
Abstract:
The tutorial is an introductory course in probability theory and mathematical statistics. Elements of combinatorics, basic concepts and theorems of probability theory, discrete random variables, continuous random variables, some limit theorems, one-dimensional and two-dimensional samples, point and interval estimation of parameters of the general population, testing of statistical hypotheses, elements of queuing theory are considered. The presentation of the theoretical material is accompanied by a large number of detailed examples of problem solving. For students of technical and economic fields of study and specialties, studying under the bachelor's and specialty programs.
APA, Harvard, Vancouver, ISO, and other styles
2

Paliy, Irina, V. A. Dalinger, and B. S. Dobronec. Probability theory and mathematical statistics. INFRA-M Academic Publishing LLC., 2023. http://dx.doi.org/10.12737/1859126.

Full text
Abstract:
The textbook is an introductory course in probability theory and mathematical statistics. Elements of combinatorics, basic concepts and theorems of probability theory, discrete random variables, continuous random variables, some limit theorems, one-dimensional and two-dimensional samples, point and interval estimation of the parameters of the general population, verification of statistical hypotheses, elements of queuing theory are considered. The presentation of the theoretical material is accompanied by a large number of detailed examples of problem solving. For students of technical and economic areas of training and specialties, studying under bachelor's and specialty programs.
APA, Harvard, Vancouver, ISO, and other styles
3

Męczarski, Marek. Problemy odporności w bayesowskiej analizie statystycznej. Szkoła Główna Handlowa, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

P, Basu Asit, and United States. Environmental Protection Agency, eds. Some problems of "safe dose" estimation. U.S. Environmental Protection Agency, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Nekipelova, E. F. Emigration of scientists: Problems, real estimations. Centre for Science Research and Statistics, 1994.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Voronin, Evgeniy, Aleksandr Chibunichev, and Yuriy Blohinov. Reliability of solving inverse problems of analytical photogrammetry. INFRA-M Academic Publishing LLC., 2023. http://dx.doi.org/10.12737/2010462.

Full text
Abstract:
The monograph is devoted to computational aspects of photogrammetric reconstruction of narrow-angle bundles of projecting beams that existed during the survey. Methods of improving the conditionality of systems of linear equations, ensuring the convergence of iterative refinement of their roots, increasing the stability of calculations in finite precision machine arithmetic are considered. The main efforts are focused on solving the problem of establishing reliable measurement weights within the framework of the least squares method. The criteria for the reliability of the weights are determined. Algorithms have been developed for matching the initial values of the measurement weights, adjusting the weights during equalization, and identifying insignificant parameters of mathematical measurement models. A new method for evaluating the accuracy of the equalization results has been developed. For specialists engaged in the processing of remote sensing data of the Earth and mathematical processing of the results of heterogeneous measurements using weighted methods of statistical estimation of the parameters of functional dependencies.
APA, Harvard, Vancouver, ISO, and other styles
7

Dudewicz, Edward J. Solutions in statistics and probability. 2nd ed. American Sciences Press, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kontoghiorghes, Erricos John. Parallel algorithms for linear models: Numerical methods and estimation problems. Kluwer Academic, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Satdarova, Faina. DIFFRACTION ANALYSIS OF DEFORMED METALS: Theory, Methods, Programs. Academus Publishing, 2019. http://dx.doi.org/10.31519/monography_1598.

Full text
Abstract:
A general analysis of the distribution of crystal orientations and dislocation density in polycrystalline systems is presented. The information recovered by adopting X-ray diffraction is new to the structural states of polycrystals. Shear phase transformations in metals, at the macroscopic and microscopic levels, become a clear process. Visualization of the advances is produced by programs included in the delivered package. Mathematical model development, experimental design, optimal statistical estimation, and simulation of the system under study and its evolution under loading serve as instrumentation. The problem-oriented software, when installed, will promote the adoption of the advanced methods in research and studies. The automation programs passed testing at the National University of Science and Technology "MISIS" (Moscow, Russian Federation). You gain an advantage in theoretical and experimental research in the field of physics of metals.
APA, Harvard, Vancouver, ISO, and other styles
10

Nicoletti, Cheti. Estimating income poverty in the presence of measurement error and missing data problems. Institute for Social and Economic Research, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Statistical estimation problem"

1

Banks, H. Thomas, Marie Davidian, John R. Samuels, and Karyn L. Sutton. "An Inverse Problem Statistical Methodology Summary." In Mathematical and Statistical Estimation Approaches in Epidemiology. Springer Netherlands, 2009. http://dx.doi.org/10.1007/978-90-481-2313-1_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Ohser, Joachim, and Konrad Sandau. "Considerations About the Estimation of the Size Distribution in Wicksell’s Corpuscle Problem." In Statistical Physics and Spatial Statistics. Springer Berlin Heidelberg, 2000. http://dx.doi.org/10.1007/3-540-45043-2_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Brown, Lawrence D. "The Differential Inequality of a Statistical Estimation Problem." In Statistical Decision Theory and Related Topics IV. Springer New York, 1988. http://dx.doi.org/10.1007/978-1-4613-8768-8_30.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ehsanes Saleh, A. K., and Pranab Kumar Sen. "On Shrinkage and Preliminary Test M-Estimation in a Parallelism Problem." In Advances in the Statistical Sciences: Foundations of Statistical Inference. Springer Netherlands, 1987. http://dx.doi.org/10.1007/978-94-009-4788-7_10.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Adam, Loïc, and Sébastien Destercke. "Multi-dimensional Maximal Coherent Subsets Made Easy: Illustration on an Estimation Problem." In Building Bridges between Soft and Statistical Methodologies for Data Science. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-15509-3_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Nguyen, Hien Duy, Florence Forbes, Gersende Fort, and Olivier Cappé. "An Online Minorization-Maximization Algorithm." In Studies in Classification, Data Analysis, and Knowledge Organization. Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-09034-9_29.

Full text
Abstract:
Modern statistical and machine learning settings often involve high data volume and data streaming, which require the development of online estimation algorithms. The online Expectation–Maximization (EM) algorithm extends the popular EM algorithm to this setting, via a stochastic approximation approach. We show that an online version of the Minorization–Maximization (MM) algorithm, which includes the online EM algorithm as a special case, can also be constructed in a similar manner. We demonstrate our approach via an application to the logistic regression problem and compare it to existing methods.
APA, Harvard, Vancouver, ISO, and other styles
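As a rough illustration of the idea in this abstract, the sketch below runs an online quadratic-majorization (MM) update for logistic regression, using Böhning's bound (per-sample Hessian bounded by x xᵀ/4). The data, step-size schedule, and ridge jitter are invented for the demonstration and need not match the algorithm in the chapter.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented streaming data for a logistic model
w_true = np.array([2.0, -1.0])
X = rng.normal(size=(3000, 2))
y = (X @ w_true + 0.3 * rng.normal(size=3000) > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

d = X.shape[1]
theta = np.zeros(d)
S = np.eye(d) * 1e-3   # running average of majorizer curvatures (x x^T / 4)
h = np.zeros(d)        # running average of majorizer linear terms
t = 0
for epoch in range(5):               # several passes over the stream
    for x, yi in zip(X, y):
        t += 1
        gamma = t ** -0.6                    # stochastic-approximation step size
        g = (sigmoid(x @ theta) - yi) * x    # per-sample gradient of the negative log-likelihood
        C = np.outer(x, x) / 4.0             # Bohning quadratic-majorizer curvature
        S = (1 - gamma) * S + gamma * C
        h = (1 - gamma) * h + gamma * (C @ theta - g)
        theta = np.linalg.solve(S + 1e-8 * np.eye(d), h)  # minimize running surrogate

acc = np.mean((sigmoid(X @ theta) > 0.5) == (y > 0.5))
print(f"streaming classification accuracy: {acc:.3f}")
```

Each update minimizes a running average of per-sample quadratic surrogates, which is the online-MM pattern; plugging in the EM surrogate instead recovers the online EM special case.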
7

Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Bayesian Genomic Linear Regression." In Multivariate Statistical Machine Learning Methods for Genomic Prediction. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_6.

Full text
Abstract:
The Bayesian paradigm for parameter estimation is introduced and linked to the main problem of genomic-enabled prediction to predict the trait of interest of the non-phenotyped individuals from genotypic information, environment variables, or other information (covariates). In this situation, a convenient practice is to include the individuals to be predicted in the posterior distribution to be sampled. We explained how the Bayesian Ridge regression method is derived and exemplified with data from plant breeding genomic selection. Other Bayesian methods (Bayes A, Bayes B, Bayes C, and Bayesian Lasso) were also described and exemplified for genome-based prediction. The chapter presented several examples that were implemented in the Bayesian generalized linear regression (BGLR) library for continuous response variables. The predictor under all these Bayesian methods includes main effects (of environments and genotypes) as well as interaction terms related to genotype × environment interaction.
APA, Harvard, Vancouver, ISO, and other styles
8

Montesinos López, Osval Antonio, Abelardo Montesinos López, and Jose Crossa. "Overfitting, Model Tuning, and Evaluation of Prediction Performance." In Multivariate Statistical Machine Learning Methods for Genomic Prediction. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-89010-0_4.

Full text
Abstract:
The overfitting phenomenon happens when a statistical machine learning model learns very well about the noise as well as the signal that is present in the training data. On the other hand, an underfitted phenomenon occurs when only a few predictors are included in the statistical machine learning model that represents the complete structure of the data pattern poorly. This problem also arises when the training data set is too small and thus an underfitted model does a poor job of fitting the training data and unsatisfactorily predicts new data points. This chapter describes the importance of the trade-off between prediction accuracy and model interpretability, as well as the difference between explanatory and predictive modeling: Explanatory modeling minimizes bias, whereas predictive modeling seeks to minimize the combination of bias and estimation variance. We assess the importance and different methods of cross-validation as well as the importance and strategies of tuning that are key to the successful use of some statistical machine learning methods. We explain the most important metrics for evaluating the prediction performance for continuous, binary, categorical, and count response variables.
APA, Harvard, Vancouver, ISO, and other styles
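A minimal sketch of the overfitting and tuning story told in this abstract, on invented data: training error falls monotonically as a polynomial model grows more flexible, while 5-fold cross-validation error is used to pick the degree.

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented example: cubic signal plus noise, fit by polynomials of growing degree
x = np.linspace(-1, 1, 60)
y = x**3 - 0.5 * x + 0.3 * rng.normal(size=x.size)

def kfold_mse(deg, k=5):
    """Mean squared error of a degree-`deg` polynomial under k-fold cross-validation."""
    idx = np.arange(x.size)
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        coef = np.polyfit(x[train], y[train], deg)
        errs.append(np.mean((np.polyval(coef, x[fold]) - y[fold]) ** 2))
    return float(np.mean(errs))

degrees = list(range(1, 13))
train_mse = [float(np.mean((np.polyval(np.polyfit(x, y, d), x) - y) ** 2))
             for d in degrees]
cv_mse = [kfold_mse(d) for d in degrees]
best = degrees[int(np.argmin(cv_mse))]
print("degree chosen by cross-validation:", best)
```

Training error alone would always prefer the most flexible model; the held-out folds penalize the noise-chasing of high degrees, which is the tuning logic the chapter formalizes.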
9

Ahmad, R. "Nonparametric Statistical Signal Detection Problems." In Nonparametric Functional Estimation and Related Topics. Springer Netherlands, 1991. http://dx.doi.org/10.1007/978-94-011-3222-0_36.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Pillonetto, Gianluigi, Tianshi Chen, Alessandro Chiuso, Giuseppe De Nicolao, and Lennart Ljung. "Classical System Identification." In Regularized System Identification. Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-95860-2_2.

Full text
Abstract:
System identification as a field has been around since the 1950s with roots from statistical theory. A substantial body of concepts, theory, algorithms and experience has been developed since then. Indeed, there is a very extensive literature on the subject, with many text books, like [5, 8, 12]. Some main points of this “classical” field are summarized in this chapter, just pointing to the basic structure of the problem area. The problem centres around four main pillars: (1) the observed data from the system, (2) a parametrized set of candidate models, “the Model structure”, (3) an estimation method that fits the model parameters to the observed data and (4) a validation process that helps taking decisions about the choice of model structure. The crucial choice is that of the model structure. The archetypical choice for linear models is the ARX model, a linear difference equation between the system’s input and output signals. This is a universal approximator for linear systems—for sufficiently high orders of the equations, arbitrarily good descriptions of the system are obtained. For a “good” model, proper choices of structural parameters, like the equation orders, are required. An essential part of the classical theory deals with asymptotic quality measures, bias and variance, that aim at giving the best mean square error between the model and the true system. Some of this theory is reviewed in this chapter for estimation methods of the maximum likelihood character.
APA, Harvard, Vancouver, ISO, and other styles
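The ARX model highlighted in this abstract can be estimated by ordinary least squares on lagged inputs and outputs. A minimal sketch on a simulated first-order system (the coefficients and noise level are invented for the demonstration):

```python
import numpy as np

rng = np.random.default_rng(2)
a1, b1 = 0.7, 1.5          # invented "true" ARX(1,1) coefficients
n = 500
u = rng.normal(size=n)     # input signal
e = 0.01 * rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):      # simulate y_t = a1*y_{t-1} + b1*u_{t-1} + noise
    y[t] = a1 * y[t - 1] + b1 * u[t - 1] + e[t]

# Least-squares fit: regress y_t on [y_{t-1}, u_{t-1}]
Phi = np.column_stack([y[:-1], u[:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[1:], rcond=None)
print("estimated coefficients:", theta)
```

With enough data and mild noise the estimates recover the simulated coefficients closely, which is the universal-approximation property the chapter appeals to for sufficiently high orders.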

Conference papers on the topic "Statistical estimation problem"

1

Craven, Allen. "Usage Spectrum Contribution to Rotorcraft Dynamic Component Reliability." In Vertical Flight Society 80th Annual Forum & Technology Display. The Vertical Flight Society, 2024. http://dx.doi.org/10.4050/f-0080-2024-0042.

Full text
Abstract:
Rotorcraft dynamic component fatigue lives and corresponding reliability have long been derived from three major contributors: material strength, loads, and usage. This paper provides a historical perspective of the contribution of aircraft usage to overall U.S. Army rotorcraft dynamic component reliability. A quick background of how we got to a six-nines reliability requirement is first provided. Different types of usage spectra and the nuances and trade-offs of two specific usage gathering methods, pilot surveys and usage monitoring, are discussed. Finally, I describe where usage spectrum fits into fatigue life calculations and the existing reliability policy and requirements. Each OEM (e.g., Bell Helicopter, Boeing, Sikorsky) has been free to develop their own fatigue methods over the years. These differences in method can lead to vastly different results, even with the same input parameters as evidenced by a now well-known round robin problem. There is notable variability between OEM methodologies, each with viable solutions to this trivariate problem. In the interest of normalizing independent U.S. Government (USG) assessments across multiple OEM paradigms, the Army is investigating a USG method to assess the reliability contribution from usage. No new methods are presented herein, only findings of previous work. Uncited opinions herein are those of the author based on literature review, peer discussions, and experience with U.S. Army and U.S. Air Force (USAF) airworthiness processes. Reliability values in this paper are approximate, as there are elements of statistical distribution and non-statistical estimation that contribute.
APA, Harvard, Vancouver, ISO, and other styles
2

Earle, Keith A., David J. Schneider, Ali Mohammad-Djafari, Jean-François Bercher, and Pierre Bessiére. "Parameter Estimation as a Problem in Statistical Thermodynamics." In BAYESIAN INFERENCE AND MAXIMUM ENTROPY METHODS IN SCIENCE AND ENGINEERING: Proceedings of the 30th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering. AIP, 2011. http://dx.doi.org/10.1063/1.3573638.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Honjo, Yusuke, and Yu Otake. "Evaluation of Statistical Estimation Error in an Embankment Stability Problem." In Geo-Risk 2017. American Society of Civil Engineers, 2017. http://dx.doi.org/10.1061/9780784480724.047.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Weiss, Christian, and Abdelhak M. Zoubir. "DOA estimation in the presence of array imperfections: A sparse regularization parameter selection problem." In 2014 IEEE Statistical Signal Processing Workshop (SSP). IEEE, 2014. http://dx.doi.org/10.1109/ssp.2014.6884647.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Bartkowiak, Tomasz, and Roman Staniek. "Application of Order Statistics in the Evaluation of Flatness Error: Sampling Problem." In ASME 2017 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/imece2017-71295.

Full text
Abstract:
The main purpose of this initial paper is to demonstrate the application of order statistics in the estimation of form error from a CMM measurement. Nowadays, modern industry sets high standards for geometrical precision, surface texture and material properties. There are many parameters that can characterize a mechanical part, out of which flatness error plays an important role in the assembly process and performance. Recently, due to greater availability and price reduction, Coordinate Measurement Techniques have increased their popularity in the industry for on-line and off-line measurements, as they allow automated measurements at a relatively low uncertainty level. Data obtained from CMM measurements have to be processed and analyzed in order to evaluate component compliance with the required technical specification. The article presents an analysis of minimal sample selection for the evaluation of flatness error by means of coordinate measurement. A statistical approach is presented, assuming that, in a repetitive manufacturing process, the distribution of deviations between surface points and the reference plane is stable. Based on the known statistical distribution, the order statistics theorem was implemented to determine the maximal and minimal point deviation statistics, as these play a dominant role in flatness error estimation. A brief analysis of normally distributed deviations is described in the paper. Moreover, a case study is presented for a set of machined parts which were components of a machine tool mechanical structure. Empirical distributions were derived and minimal sample sizes were estimated for the given confidence levels using the proposed theorem. The estimation errors of flatness values for the derived sample sizes were analyzed and discussed in the paper.
APA, Harvard, Vancouver, ISO, and other styles
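Under the stable-distribution assumption, the order-statistics argument in this abstract leads to a classic sample-size bound: the probability that all n sampled deviations fall below the q-quantile is q^n, so the sample maximum exceeds that quantile with probability 1 - q^n. A small sketch (the values of q and the confidence level are illustrative choices, not the paper's):

```python
import math

def min_sample_size(q, gamma):
    """Smallest n such that the sample maximum of n iid deviations
    exceeds the population q-quantile with probability >= gamma.

    P(all n deviations below the q-quantile) = q**n,
    so we require 1 - q**n >= gamma.
    """
    return math.ceil(math.log(1.0 - gamma) / math.log(q))

# To see the worst 5% of deviations with 95% confidence:
n = min_sample_size(0.95, 0.95)
print(n)  # -> 59
```

This is the same reasoning behind the well-known "59 samples" rule for 95/95 coverage, and it shows why the extreme deviation statistics dominate the flatness-error estimate.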
6

Buryachenko, Valeriy A. "Effective Behavior of Peridynamic Random Structure Composites Subjected to Body Force With Compact Support." In ASME 2022 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/imece2022-95107.

Full text
Abstract:
We consider a static problem for statistically homogeneous matrix linear bond-based peridynamic composite materials (CMs). Even for locally elastic composites subjected to inhomogeneous loading, the effective deformations are described by a nonlocal (either the differential or integral) operator relating a statistical average of stresses in the point being considered with a statistical average of strains in the vicinity of this point. The basic hypotheses of locally elastic micromechanics are generalized to their peridynamic counterparts (see for details [1]). The method is based on estimation of a perturbator introduced by one inclusion that is, in fact, the solutions of the basic problem for one inclusion inside the infinite peristatic matrix subjected to the body force. The statistical averages of both the displacements and stresses are estimated by summation of these perturbators for all possible locations of inclusions (in so doing, renormalizing procedure is not required). It allows us to estimate not only the effective nonlocal constitutive equation but also to evaluate the statistical field averages (inhomogeneous and nonlocal) inside the phases at the fine scale that is critically important for advanced modelling (e.g. for any nonlinear phenomena). In particular, in the generalized effective field method (EFM) proposed, the effective field is evaluated from self-consistent estimations by the use of closing of a corresponding integral equation in the framework of the quasi-crystalline approximation. In so doing, the classical effective field hypothesis is relaxed, and the hypothesis of the ellipsoidal symmetry of the random structure of CMs is not used. Numerical results for the estimation of effective displacements are obtained for 1D statistically homogeneous CM bar with the prescribed self equilibrated body forces.
APA, Harvard, Vancouver, ISO, and other styles
7

Shelekhov, Alexander P. "Use of the perturbation technique in the statistical estimation theory for analyzing the Doppler sounding problem." In Satellite Remote Sensing III, edited by Adam D. Devir, Anton Kohnle, and Christian Werner. SPIE, 1997. http://dx.doi.org/10.1117/12.263152.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Hougaz, Ana Beatriz, David Lima, Bernardo Peters, Patricia Cury, and Luciano Oliveira. "Sex estimation on panoramic dental radiographs: A methodological approach." In Simpósio Brasileiro de Computação Aplicada à Saúde. Sociedade Brasileira de Computação - SBC, 2023. http://dx.doi.org/10.5753/sbcas.2023.229563.

Full text
Abstract:
Estimating sex using tooth radiographs requires knowledge of a comprehensive spectrum of maxillary anatomy, which ultimately demands specialization in the anatomical structures of the oral cavity. In this paper, we propose a more effective methodological study than others present in the literature for the problem of automatic sex estimation. Our methodology uses the largest publicly available data set in the literature, raises statistical significance in the performance assessment, and explains which part of the images influences the classification. Our findings showed that although EfficientNetV2-Large reached an average F1-score of 91.43% ± 0.67, an EfficientNet-B0 could be more beneficial, with a very close F1-score and a much lighter architecture.
APA, Harvard, Vancouver, ISO, and other styles
9

Mas-Soler, J., Pedro C. de Mello, Eduardo A. Tannuri, Alexandre N. Simos, and A. Souto-Iglesias. "Impact of the Uncertainties of the RAOs of a Semisubmersible Platform on the Performance of a Motion-Based Wave Inference Method." In ASME 2019 38th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/omae2019-96670.

Full text
Abstract:
Abstract Motion based wave inference allows the estimation of the directional sea spectrum from the measured motions of a vessel. Solving the resulting inverse problem is challenging as it is often ill-posed; as a matter of fact, statistical errors of the estimated platform response functions (RAOs) may lead to misleading estimations of the sea states as many noise values are severely amplified in the mathematical process. Hence, in order to obtain reliable estimations of the sea conditions some hypothesis must be included by means of regularization parameters. This work discusses how these errors affect the regularization parameters and the accuracy of the sea state estimations. For this purpose, a statistical quantification of the errors associated to the estimated transfer functions has been included in an expanded Bayesian inference approach. Then, the resulting statistical inference model has been verified by means of a comparison between the outputs of this approach and those obtained without considering the statistical errors in the Bayesian inference. The assessment of the impact on the accuracy of the estimations is based on the results of a dedicated model-scale experimental campaign, which includes more than 150 different test conditions.
APA, Harvard, Vancouver, ISO, and other styles
10

Mor, M., and A. Wolf. "Rigid-Body Motion Estimation Using Statistical Solid Dynamics Method and Dynamic Method." In ASME 2008 9th Biennial Conference on Engineering Systems Design and Analysis. ASMEDC, 2008. http://dx.doi.org/10.1115/esda2008-59365.

Full text
Abstract:
The most frequently used method in three dimensional human gait analysis involves placing markers on the skin of the analyzed segment. This introduces a significant artifact which strongly influences the bone position and orientation and joint kinematics estimates. The benefit in developing a method to reduce soft tissue artifacts is significant, resulting in the prevention, better diagnosis, and treatment of joint disorders and in the design of better prosthetic devices with longer mean times to failure. In this work we approached the problem of soft tissue artifacts from both a dynamic method and a statistical solid dynamics method. The dynamic method is based on the implementation of a Lagrangian approach to drive model based procedure for the estimation of the rigid body (bone) motion for the measurements of markers attached to the skin. The statistical solid dynamics method is a combination of several existent tools. It is based on a least squares optimization of markers position and orientation. Both methods were tested and evaluated using computer simulation and similar dynamics systems.
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Statistical estimation problem"

1

Rossi, José Luiz, José Eduardo Gonçalves de Sousa, and Jose Alejandro Gutiérrez Briceño. Minding the Output Gap: A Hamilton Filter Approach and Updated Estimates for the Brazilian Economy. Inter-American Development Bank, 2023. http://dx.doi.org/10.18235/0004981.

Full text
Abstract:
This paper develops an alternative approach for estimating the potential output and the output gap, intended to serve as a good balance between a simple, low-data-requirement method and a powerful but complex structural approach. We rely on the properties of Hamilton's regression filter to generate a statistically robust estimator of the potential Gross Domestic Product level that overcomes the problems associated with the Hodrick-Prescott filter and improves on the Production Function Approach (PFA). Furthermore, we use this methodology to update the estimates of the potential output and output gap for the Brazilian economy.
APA, Harvard, Vancouver, ISO, and other styles
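Hamilton's regression filter, which the paper builds on, amounts to regressing y at horizon t+h on a constant and p recent lags and taking the residual as the cyclical component. A minimal sketch (h=8 and p=4 are the usual quarterly defaults; the trend series is invented to show that a deterministic trend yields a null cycle):

```python
import numpy as np

def hamilton_filter(y, h=8, p=4):
    """Cycle = residual of regressing y_{t+h} on a constant and p lags y_t, ..., y_{t-p+1}."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    X = np.column_stack(
        [np.ones(T - h - p + 1)] + [y[p - 1 - j : T - h - j] for j in range(p)]
    )
    target = y[p - 1 + h :]
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return target - X @ beta   # cyclical component

# A deterministic linear trend is perfectly predictable from its own lags,
# so its estimated cyclical component should be numerically zero.
trend = 0.5 * np.arange(60)
cycle = hamilton_filter(trend)
print(np.max(np.abs(cycle)))
```

On real GDP data the same regression leaves the unpredictable medium-run fluctuations in the residual, which is the output-gap estimate.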
2

Tosi, R., R. Codina, J. Principe, R. Rossi, and C. Soriano. D3.3 Report of ensemble based parallelism for turbulent flows and release of solvers. Scipedia, 2022. http://dx.doi.org/10.23967/exaqute.2022.3.06.

Full text
Abstract:
In this work we focus on reducing the wall clock time required to compute statistical estimators of highly chaotic incompressible flows on high performance computing systems. Our approach consists of replacing a single long-term simulation by an ensemble of multiple independent realizations, which are run in parallel with different initial conditions. A failure probability convergence criterion must be satisfied by the statistical estimator of interest to assess convergence. Its error analysis leads to the identification of two error contributions: the initialization bias and the statistical error. We propose an approach to systematically detect the burn-in time in order to minimize the initialization bias, accompanied by strategies to reduce simulation cost. The framework is validated on two very high Reynolds number obstacle problems of wind engineering interest in a high performance computing environment.
APA, Harvard, Vancouver, ISO, and other styles
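The initialization bias discussed in this report can be mimicked with a toy ergodic process: an AR(1) "flow" started far from its stationary mean. Averaging an ensemble of independent realizations with and without a burn-in cutoff (all numbers invented) shows how discarding the transient removes the bias:

```python
import numpy as np

rng = np.random.default_rng(3)

def realization(n=200, phi=0.9, x0=50.0):
    """One AR(1) path, deliberately started far from its stationary mean of 0."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()
    return x

# Ensemble of independent realizations (run in parallel in the real framework)
ens = np.array([realization() for _ in range(50)])

burn = 100                           # burn-in cutoff for the transient
est_no_burn = ens.mean()             # estimator contaminated by the start-up transient
est_burn = ens[:, burn:].mean()      # estimator after discarding the burn-in
print(est_no_burn, est_burn)
```

The stationary mean is 0, so the gap between the two estimates is exactly the initialization bias the report's burn-in detection is designed to eliminate; what remains after the cutoff is the statistical error.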
3

Martner, Ricardo. Fiscal Indicators in Latin-American Countries. Inter-American Development Bank, 2005. http://dx.doi.org/10.18235/0012270.

Full text
Abstract:
The purpose of this document is to provide a comparative analysis of Latin-American government finance statistics including public expenditures, income, overall balances, and debt stocks. The paper explores some of the problems that arise in country comparisons and in regional harmonization of fiscal targets, an important issue when considering common goals of overall balances and public debt. The paper also discusses some of the new initiatives such as: applying accrual accounting and registering all variations of public net worth; including economic cycle and relative prices fluctuations in the estimation of overall balances and public debt; seeking to protect investment in strategic areas; and establishing priority areas in social expenditures.
APA, Harvard, Vancouver, ISO, and other styles
4

Nobile, F., Q. Ayoul-Guilmard, S. Ganesh, et al. D6.5 Report on stochastic optimisation for wind engineering. Scipedia, 2022. http://dx.doi.org/10.23967/exaqute.2022.3.04.

Full text
Abstract:
This report presents the latest methods of optimisation under uncertainties investigated in the ExaQUte project, and their applications to problems related to civil and wind engineering. The measure of risk throughout the report is the conditional value at risk. First, the reference method is presented: the derivation of sensitivities of the risk measure; their accurate computation; and lastly, a practical optimisation algorithm with adaptive statistical estimation. Second, this method is directly applied to a nonlinear relaxation oscillator (FitzHugh–Nagumo model) with numerical experiments to demonstrate its performance. Third, the optimisation method is adapted to the shape optimisation of an airfoil and illustrated by a large-scale experiment on a computing cluster. Finally, the benchmark of the shape optimisation of a tall building under a turbulent flow is presented, followed by an adaptation of the optimisation method. All numerical experiments showcase the open-source software stack of the ExaQUte project for large-scale computing in a distributed environment.
APA, Harvard, Vancouver, ISO, and other styles
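The risk measure used throughout the report, the conditional value at risk, admits the Rockafellar-Uryasev variational form CVaR_a(X) = min over t of t + E[(X - t)^+] / (1 - a), which is what makes it tractable in optimization. A small sample-based sketch (the loss sample is invented):

```python
import numpy as np

def cvar(samples, alpha=0.9):
    """Sample CVaR via the Rockafellar-Uryasev formula:
    CVaR_a = min_t  t + E[(X - t)^+] / (1 - a).
    The minimum is attained at a sample point, so we scan the sorted sample."""
    samples = np.asarray(samples, dtype=float)
    ts = np.sort(samples)
    vals = ts + np.maximum(samples[None, :] - ts[:, None], 0).mean(axis=1) / (1 - alpha)
    return float(vals.min())

# Losses 1..100: the worst 10% are 91..100, whose mean is 95.5
x = np.arange(1.0, 101.0)
print(cvar(x, alpha=0.9))  # -> 95.5
```

Because the objective is convex in t (and jointly convex with design parameters), the same formula is what allows CVaR to enter the optimization loops described in the report.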
5

Amengual, Dante, Xinyue Bei, Marine Carrasco, and Enrique Sentana. Score-type tests for normal mixtures. CIRANO, 2023. http://dx.doi.org/10.54932/uxsg1990.

Full text
Abstract:
Testing normality against discrete normal mixtures is complex because some parameters turn increasingly underidentified along alternative ways of approaching the null, others are inequality constrained, and several higher-order derivatives become identically 0. These problems make the maximum of the alternative model log-likelihood function numerically unreliable. We propose score-type tests asymptotically equivalent to the likelihood ratio as the largest of two simple intuitive statistics that only require estimation under the null. One novelty of our approach is that we treat symmetrically both ways of writing the null hypothesis without excluding any region of the parameter space. We derive the asymptotic distribution of our tests under the null and sequences of local alternatives. We also show that their asymptotic distribution is the same whether applied to observations or standardized residuals from heteroskedastic regression models. Finally, we study their power in simulations and apply them to the residuals of Mincer earnings functions.
APA, Harvard, Vancouver, ISO, and other styles
6

Flici, Farid, and Nacer-Eddine Hammouda. Mortality evolution in Algeria: What can we learn about data quality? Verlag der Österreichischen Akademie der Wissenschaften, 2021. http://dx.doi.org/10.1553/populationyearbook2021.res1.3.

Full text
Abstract:
Mortality in Algeria has declined significantly since the country declared its independence in 1962. This trend has been accompanied by improvements in data quality and changes in estimation methodology, both of which are scarcely documented, and may distort the natural evolution of mortality as reported in official statistics. In this paper, our aim is to detect these methodological and data quality changes by means of the visual inspection of mortality surfaces, which represent the evolution of mortality rates, mortality improvement rates and the male-female mortality ratio over age and time. Data quality problems are clearly visible during the 1977–1982 period. The quality of mortality data has improved after 1983, and even further since the population census of 1998, which coincided with the end of the civil war. Additional inexplicable patterns have also been detected, such as a changing mortality age pattern during the period before 1983, and a changing pattern of excess female mortality at reproductive ages, which suddenly appears in 1983 and disappears in 1992.
APA, Harvard, Vancouver, ISO, and other styles
7

Valenzuela, Patricio, and Hugo R. Ñopo. Becoming an Entrepreneur. Inter-American Development Bank, 2007. http://dx.doi.org/10.18235/0010974.

Full text
Abstract:
Using the 1996-2001 Chilean CASEN Panel Survey, this paper analyzes the impact on income of the switch from salaried employment to entrepreneurship (self-employment and leadership of micro-enterprises). By means of a difference-in-differences non-parametric matching estimator, the paper alleviates problems of selection bias (on observable and unobservable traits) and creates the appropriate counterfactuals of interest. The results indicate that the income gains associated with the switch from salaried employment to entrepreneurship are positive, statistically significant and financially substantial. Moreover, the results are qualitatively the same using means and medians, suggesting that the impacts are not driven by a few superstar winners. Additionally, the income changes associated with the reverse switch (from self-employment to salaried jobs) are negative. The results also suggest interesting gender differences, as females show higher gains than males on the switch from salaried jobs to entrepreneurship and lower losses on the reverse switch.
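The difference-in-differences logic at the core of the estimator above compares the income change of switchers against the income change of comparable stayers over the same period. A minimal sketch (a hypothetical illustration with made-up incomes; the paper additionally matches non-parametrically on observables):

```python
import numpy as np

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Estimated effect: income change for switchers minus income change
    for comparable non-switchers over the same period."""
    treated_change = np.mean(treated_post) - np.mean(treated_pre)
    control_change = np.mean(control_post) - np.mean(control_pre)
    return float(treated_change - control_change)

# Hypothetical incomes: switchers gain 2 units, stayers gain 1 unit,
# so the estimated effect of switching is 1 unit.
diff_in_diff([10, 12], [12, 14], [10, 12], [11, 13])  # → 1.0
```

Subtracting the stayers' change nets out common time trends, which is what alleviates part of the selection-bias problem the abstract mentions.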
APA, Harvard, Vancouver, ISO, and other styles
8

Searcy, Stephen W., and Kalman Peleg. Adaptive Sorting of Fresh Produce. United States Department of Agriculture, 1993. http://dx.doi.org/10.32747/1993.7568747.bard.

Full text
Abstract:
This project includes two main parts: development of a “Selective Wavelength Imaging Sensor” and of an “Adaptive Classifier System”, for adaptive imaging and sorting of agricultural products respectively. Three different technologies were investigated for building a selectable-wavelength imaging sensor: diffraction gratings, tunable filters and linear variable filters. Each technology was analyzed and evaluated as the basis for implementing the adaptive sensor. Acousto-optic tunable filters were found to be most suitable for the selective wavelength imaging sensor. Consequently, a selectable-wavelength imaging sensor was constructed and tested using the selected technology. The sensor was tested and algorithms for multispectral image acquisition were developed. A high-speed inspection system for fresh-market carrots was built and tested. It was shown that a combination of efficient parallel processing by a DSP and a PC-based host CPU, in conjunction with a hierarchical classification system, yielded an inspection system capable of handling two carrots per second with a classification accuracy of more than 90%. The adaptive sorting technique was extensively investigated and conclusively demonstrated to reduce misclassification rates in comparison to conventional non-adaptive sorting. The adaptive classifier algorithm was modeled and reduced to a series of modules that can be added to any existing produce sorting machine. A simulation of the entire process was created in Matlab using a graphical user interface to make the difficult theoretical subjects more accessible. Typical grade classifiers based on k-Nearest Neighbor techniques and linear discriminants were implemented. The sample histogram, estimating the cumulative distribution function (CDF), was chosen as the characterizing feature of prototype populations, whereby the Kolmogorov-Smirnov statistic was employed as a population classifier.
Simulations were run on artificial data with two dimensions, four populations and three classes. A quantitative analysis of the adaptive classifier's dependence on population separation, training set size, and stack length determined optimal values for the different parameters involved. The technique was also applied to a real produce sorting problem: an automatic machine for sorting dates by machine vision in an Israeli date packinghouse. Extensive simulations were run on actual sorting data of dates collected over a four-month period. In all cases, the results showed a clear reduction in classification error when using the adaptive technique versus non-adaptive sorting.
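The population-classification idea described above — compare a sample's empirical CDF against each prototype population and assign it to the nearest one by the Kolmogorov-Smirnov statistic — can be sketched as follows (an illustrative reimplementation, not the project's Matlab code; the names are hypothetical):

```python
import numpy as np

def ks_distance(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the largest vertical gap
    between the two empirical CDFs, evaluated on the pooled sample points."""
    a = np.sort(np.asarray(a, dtype=float))
    b = np.sort(np.asarray(b, dtype=float))
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))

def classify(sample, prototypes):
    """Assign the sample to the prototype population whose empirical CDF
    is closest in K-S distance."""
    return min(prototypes, key=lambda label: ks_distance(sample, prototypes[label]))

prototypes = {"class_a": [0.0, 0.1, 0.2, 0.3], "class_b": [5.0, 5.1, 5.2, 5.3]}
classify([0.05, 0.15, 0.25], prototypes)  # → "class_a"
```

Because the K-S statistic compares whole distributions rather than single features, prototype populations can be updated on the fly, which is what makes the scheme adaptive.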
APA, Harvard, Vancouver, ISO, and other styles
9

Wideman, Jr., Robert F., Nicholas B. Anthony, Avigdor Cahaner, Alan Shlosberg, Michel Bellaiche, and William B. Roush. Integrated Approach to Evaluating Inherited Predictors of Resistance to Pulmonary Hypertension Syndrome (Ascites) in Fast Growing Broiler Chickens. United States Department of Agriculture, 2000. http://dx.doi.org/10.32747/2000.7575287.bard.

Full text
Abstract:
Background PHS (pulmonary hypertension syndrome, ascites syndrome) is a serious cause of loss in the broiler industry, and is a prime example of an undesirable side effect of successful genetic development that may be deleteriously manifested by factors in the environment of growing broilers. Basically, continuous and pinpointed selection for rapid growth in broilers has led to higher oxygen demand and consequently to more frequent manifestation of an inherent cardiopulmonary incapability to sufficiently oxygenate the arterial blood. The multifaceted causes and modifiers of PHS make research into finding solutions to the syndrome a complex, multi-threaded challenge. This research used several directions to better understand the development of PHS and to probe possible means of monitoring and increasing resistance to the syndrome. Research Objectives (1) To evaluate the growth dynamics of individuals within breeding stocks and their correlation with individual susceptibility or resistance to PHS; (2) To compile data on diagnostic indices found in this work to be predictive for PHS, during exposure to experimental protocols known to trigger PHS; (3) To conduct detailed physiological evaluations of cardiopulmonary function in broilers; (4) To compile data on growth dynamics and other diagnostic indices in existing lines selected for susceptibility or resistance to PHS; (5) To integrate growth dynamics and other diagnostic data within appropriate statistical procedures to provide geneticists with predictive indices that characterize resistance or susceptibility to PHS. Revisions In the first year, the US team acquired the costly Peckode weigh platform / individual bird I.D. system that was to provide continuous (several times each day), automated weighing of birds for comprehensive monitoring of growth dynamics. However, the data generated were found to be inaccurate and irreproducible, making its use impractical.
Thereafter, weighing was done manually; this highly labor-intensive work precluded some of the original objectives of applying such a growth-dynamics strategy in selection procedures involving thousands of birds. Major conclusions, solutions, achievements 1. Healthy broilers were found to have greater oscillations in growth velocity and acceleration than PHS-susceptible birds. This proved the scientific validity of our original hypothesis that such differences occur. 2. Growth rate in the first week is higher in PHS-susceptible than in PHS-resistant chicks. An artificial neural network accurately distinguished the two groups based on growth patterns in this period. 3. In the US, the unilateral pulmonary occlusion technique was used in collaboration with a major broiler breeding company to create a commercial broiler line that is highly resistant to PHS induced by fast growth and low ambient temperatures. 4. In Israel, lines were obtained by genetic selection on PHS mortality after cold exposure in a dam-line population comprising 85 sire families. The wide range of PHS incidence per family (0-50%), high heritability (about 0.6), and the results in cold-challenged progeny suggested a highly effective and relatively easy means of selecting for PHS resistance. 5. The best minimally invasive diagnostic indices for prediction of PHS resistance were found to be oximetry, hematocrit values, heart rate and electrocardiographic (ECG) lead II waves. Some differences in results were found between the US and Israeli teams, probably reflecting genetic differences in the broiler strains used in the two countries. For instance, the US team found the S wave amplitude to predict PHS susceptibility well, whereas the Israeli team found the P wave amplitude to be a more valid predictor. 6.
Comprehensive physiological studies further increased knowledge of the development of PHS: cardiopulmonary characteristics of pre-ascitic birds, pulmonary arterial wedge pressures, the hypotension/kidney response, and pulmonary hemodynamic responses to vasoactive mediators were all examined in depth. Implications, scientific and agricultural Substantial progress has been made in understanding the genetic and environmental factors involved in PHS, and their interaction. The two teams each successfully developed different selection programs, by surgical means and by divergent selection under cold challenge. Monitoring of the progress and success of the programs was done by using the in-depth estimates that this research engendered of the reliability and value of non-invasive predictive parameters. These findings helped corroborate the validity of practical, research-based selection programs for improving PHS resistance.
APA, Harvard, Vancouver, ISO, and other styles
10

Peru logistics chain analysis. Population Council, 1998. http://dx.doi.org/10.31899/rh1998.1015.

Full text
Abstract:
The inventory module of situation analysis, developed by the Population Council, has been adapted to help generate indicators of logistics system functioning suggested by the Evaluation Project and to provide an analytical framework that permits more accurate estimation of the frequency, location, and patterns of logistics problems. In 1996, the module was implemented in southern Peru. In 1997, the same module was implemented in the province of Santa and two provinces of Huancavelica. Comparable information was obtained from 149 service delivery points (SDPs) in four departments. Data include inventories of contraceptive supplies and materials required for safe delivery of contraception; frequency of contraceptive stockouts in the three months prior to the survey; service and distribution statistics required for calculating the number of months of stock on hand; and potential or evident problems with storage practices or conditions. In this paper, data on material stockouts are cross-tabulated with the logistical framework within which each SDP operates, based on the hypothesis that understanding problems in logistics system functioning requires knowledge of the context in which SDPs are resupplied. An analysis of logistics systems that incorporates the supply chain may help program managers better tackle logistical problems in the future.
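The "months of stock on hand" indicator mentioned above is conventionally computed as current inventory divided by average monthly consumption; a hypothetical sketch of that convention (not code from the report):

```python
def months_of_stock(stock_on_hand, monthly_issues):
    """Months of supply remaining if consumption continues at the average
    rate observed over the recent months of distribution data."""
    avg_monthly = sum(monthly_issues) / len(monthly_issues)
    return stock_on_hand / avg_monthly

# An SDP holding 300 units and issuing about 100 units/month in the three
# months prior to the survey has 3 months of stock on hand.
months_of_stock(300, [90, 100, 110])  # → 3.0
```

Values well below a program's reorder threshold flag SDPs at risk of the stockouts the survey recorded.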
APA, Harvard, Vancouver, ISO, and other styles