Academic literature on the topic 'Estimation theory – Simulation methods'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Estimation theory – Simulation methods.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Estimation theory – Simulation methods"

1

Andersen, Torben G. "Simulation-Based Econometric Methods." Econometric Theory 16, no. 1 (2000): 131–38. http://dx.doi.org/10.1017/s0266466600001080.

Full text
Abstract:
The accessibility of high-performance computing power has always influenced theoretical and applied econometrics. Gouriéroux and Monfort begin their recent offering, Simulation-Based Econometric Methods, with a stylized three-stage classification of the history of statistical econometrics. In the first stage, lasting through the 1960s, models and estimation methods were designed to produce closed-form expressions for the estimators. This spurred thorough investigation of the standard linear model, linear simultaneous equations with the associated instrumental variable techniques, and maximum likelihood estimation within the exponential family. During the 1970s and 1980s the development of powerful numerical optimization routines led to the exploration of procedures without closed-form solutions for the estimators. During this period the general theory of nonlinear statistical inference was developed, and nonlinear micro models such as limited dependent variable models and nonlinear time series models, e.g., ARCH, were explored. The associated estimation principles included maximum likelihood (beyond the exponential family), pseudo-maximum likelihood, nonlinear least squares, and generalized method of moments. Finally, the third stage considers problems without a tractable analytic criterion function. Such problems almost invariably arise from the need to evaluate high-dimensional integrals. The idea is to circumvent the associated numerical problems by a simulation-based approach. The main requirement is therefore that the model may be simulated given the parameters and the exogenous variables. The approach delivers simulated counterparts to standard estimation procedures and has inspired the development of entirely new procedures based on the principle of indirect inference.
APA, Harvard, Vancouver, ISO, and other styles
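The third-stage, simulation-based idea that Andersen describes can be illustrated with a minimal simulated-method-of-moments sketch. Everything here is a toy: the Gaussian model, the moment choices, and the grid search are illustrative stand-ins, not the procedures of the reviewed book.

```python
import numpy as np

def simulate(theta, n, seed):
    # toy structural model: y ~ Normal(theta, 1); in real applications the
    # model can be simulated but its likelihood is intractable
    return np.random.default_rng(seed).normal(theta, 1.0, n)

data = simulate(2.0, 2000, seed=1)

def smm_objective(theta):
    # distance between simulated and observed first/second moments, with
    # the simulation draws held fixed across theta (common random numbers)
    sim = simulate(theta, 10000, seed=2)
    g = np.array([sim.mean() - data.mean(), sim.var() - data.var()])
    return g @ g

grid = np.linspace(0.0, 4.0, 801)  # crude search; real work uses an optimizer
theta_hat = grid[np.argmin([smm_objective(t) for t in grid])]
```

Holding the simulation seed fixed across candidate parameters is what makes the objective smooth enough to minimize, which is the standard trick in this literature.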
2

Maksimović, D. M., and V. B. Litovski. "Logic simulation methods for longest path delay estimation." IEE Proceedings - Computers and Digital Techniques 149, no. 2 (2002): 53. http://dx.doi.org/10.1049/ip-cdt:20020201.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Almongy, Hisham M., Fatma Y. Alshenawy, Ehab M. Almetwally, and Doaa A. Abdo. "Applying Transformer Insulation Using Weibull Extended Distribution Based on Progressive Censoring Scheme." Axioms 10, no. 2 (2021): 100. http://dx.doi.org/10.3390/axioms10020100.

Full text
Abstract:
In this paper, the Weibull extension distribution parameters are estimated under a progressive type-II censoring scheme with random removal. The parameters of the model are estimated using the maximum likelihood method, maximum product spacing, and Bayesian estimation methods. In classical estimation (maximum likelihood and maximum product spacing), we used the Newton–Raphson algorithm. Bayesian estimation is carried out using the Metropolis–Hastings algorithm based on the squared error loss function. The proposed estimation methods are compared using Monte Carlo simulations under a progressive type-II censoring scheme. An empirical study using a real data set of transformer insulation and a simulation study are performed to validate the introduced methods of inference. Based on the results of our study, it can be concluded that the Bayesian method outperforms the maximum likelihood and maximum product spacing methods for estimating the Weibull extension parameters under a progressive type-II censoring scheme in both the simulation and empirical studies.
APA, Harvard, Vancouver, ISO, and other styles
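The Metropolis–Hastings step used in the paper can be sketched in miniature. This is a hedged toy: it fits the shape of an ordinary one-parameter Weibull (scale fixed at 1, complete data, flat prior), not the Weibull extension model under progressive censoring; the posterior mean is reported because squared-error loss makes it the Bayes estimate.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.weibull(1.5, 300)      # unit-scale Weibull, true shape k = 1.5

def log_post(k):
    # Weibull(k, scale=1) log-likelihood plus a flat prior on k > 0
    if k <= 0:
        return -np.inf
    return len(data) * np.log(k) + (k - 1) * np.log(data).sum() - (data ** k).sum()

# random-walk Metropolis-Hastings
k, lp, chain = 1.0, log_post(1.0), []
for _ in range(20000):
    prop = k + rng.normal(0.0, 0.1)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:   # accept with prob min(1, ratio)
        k, lp = prop, lp_prop
    chain.append(k)

# under squared-error loss the Bayes point estimate is the posterior mean
k_bayes = float(np.mean(chain[5000:]))        # discard burn-in
```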
4

Hamad, Alaa M., and Bareq B. Salman. "Different Estimation Methods of the Stress-Strength Reliability Restricted Exponentiated Lomax Distribution." Mathematical Modelling of Engineering Problems 8, no. 3 (2021): 477–84. http://dx.doi.org/10.18280/mmep.080319.

Full text
Abstract:
The Lomax distribution, a heavy-tailed probability distribution used in industry, economics, actuarial science, queueing theory, and Internet traffic modeling, is among the most important distributions in reliability theory. In this paper, the reliability of the restricted exponentiated Lomax distribution is estimated in two cases: when a single component has strength X and stress Y, so that R = P(Y < X), and when the system contains two strength components in series under stress Y. Different estimation methods are used, such as maximum likelihood, least squares, and shrinkage methods. The results of the applied methods are compared on the basis of mean square error (MSE) to identify the best method, and the obtained results are displayed via the MATLAB software package.
APA, Harvard, Vancouver, ISO, and other styles
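The stress-strength reliability R = P(Y < X) can always be approximated by Monte Carlo simulation. As a hedged illustration, the sketch below uses exponential stand-ins for the paper's exponentiated Lomax variates, because exponentials admit a closed form to check against.

```python
import numpy as np

rng = np.random.default_rng(7)

# Monte Carlo estimate of R = P(Y < X): strength X survives stress Y
lam_x, lam_y = 1.0, 2.0                    # rates: X ~ Exp(1), Y ~ Exp(2)
x = rng.exponential(1 / lam_x, 200_000)
y = rng.exponential(1 / lam_y, 200_000)

R_mc = float(np.mean(y < x))
R_exact = lam_y / (lam_x + lam_y)          # closed form for exponentials: 2/3
```

The same sampling scheme works for any strength/stress pair that can be simulated, which is why Monte Carlo comparisons of estimators are standard in this literature.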
5

Liu, Weiwei, Shezhou Luo, Xiaoliang Lu, Jon Atherton, and Jean-Philippe Gastellu-Etchegorry. "Simulation-Based Evaluation of the Estimation Methods of Far-Red Solar-Induced Chlorophyll Fluorescence Escape Probability in Discontinuous Forest Canopies." Remote Sensing 12, no. 23 (2020): 3962. http://dx.doi.org/10.3390/rs12233962.

Full text
Abstract:
The escape probability of solar-induced chlorophyll fluorescence (SIF) can be remotely estimated using reflectance measurements based on spectral invariants theory. This can then be used to correct the effects of canopy structure on canopy-leaving SIF. However, the feasibility of these estimation methods is untested in heterogeneous vegetation such as the discontinuous forest canopy layer under evaluation here. In this study, the Discrete Anisotropic Radiative Transfer (DART) model is used to simulate canopy-leaving SIF, canopy total emitted SIF, canopy interceptance, and the fraction of absorbed photosynthetically active radiation (fAPAR) in order to evaluate the estimation methods of SIF escape probability in discontinuous forest canopies. Our simulation results show that the normalized difference vegetation index (NDVI) can be used to partly eliminate the effects of background reflectance on the estimation of SIF escape probability in most cases, but fails to produce accurate estimations if the background is partly or totally covered by vegetation. We also found that SIF escape probabilities estimated at a high solar zenith angle have better estimation accuracy than those estimated at a lower solar zenith angle. Our results show that additional errors will be introduced to the estimation of SIF escape probability with the use of satellite products, especially when the product of leaf area index (LAI) and clumping index (CI) is underestimated. In other results, fAPAR has comparable estimation accuracy of SIF escape probability when compared to canopy interceptance. Additionally, fAPAR for the entire canopy has better estimation accuracy of SIF escape probability than fAPAR for leaves only in sparse forest canopies. These results help us to better understand the current estimation results of SIF escape probability based on spectral invariants theory, and to improve its estimation accuracy in discontinuous forest canopies.
APA, Harvard, Vancouver, ISO, and other styles
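The NDVI used in the study to suppress background effects is a simple band ratio, (NIR − Red) / (NIR + Red). A minimal sketch, with made-up reflectance values purely for illustration:

```python
import numpy as np

def ndvi(nir, red):
    # normalized difference vegetation index: (NIR - Red) / (NIR + Red)
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

# dense canopy reflects strongly in the near-infrared and absorbs red,
# so its NDVI is high; soil-like reflectances give values near zero
dense = ndvi(0.50, 0.05)
soil = ndvi(0.30, 0.25)
```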
6

Rodríguez-García, Marco A., Isaac Pérez Castillo, and P. Barberis-Blostein. "Efficient qubit phase estimation using adaptive measurements." Quantum 5 (June 4, 2021): 467. http://dx.doi.org/10.22331/q-2021-06-04-467.

Full text
Abstract:
Estimating correctly the quantum phase of a physical system is a central problem in quantum parameter estimation theory due to its wide range of applications from quantum metrology to cryptography. Ideally, the optimal quantum estimator is given by the so-called quantum Cramér-Rao bound, so any measurement strategy aims to obtain estimations as close as possible to it. However, more often than not, the current state-of-the-art methods to estimate quantum phases fail to reach this bound as they rely on maximum likelihood estimators of non-identifiable likelihood functions. In this work we thoroughly review various schemes for estimating the phase of a qubit, identifying the underlying problem which prohibits these methods to reach the quantum Cramér-Rao bound, and propose a new adaptive scheme based on covariant measurements to circumvent this problem. Our findings are carefully checked by Monte Carlo simulations, showing that the method we propose is both mathematically and experimentally more realistic and more efficient than the methods currently available.
APA, Harvard, Vancouver, ISO, and other styles
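The role of the quantum Cramér-Rao bound can be seen in a purely classical toy: for a projective qubit measurement with outcome probability p = (1 + cos φ)/2, the Fisher information works out to exactly 1 per shot, so the variance of a good estimator should approach 1/n. The simulation below illustrates that limit; it is not the adaptive covariant scheme proposed in the paper, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
phi, n_shots, n_trials = 1.2, 2000, 400

# fixed-basis measurement: outcome "0" has probability p = (1 + cos(phi)) / 2;
# the Fisher information of this model is 1 per shot, so the Cramer-Rao
# bound on the estimator variance is 1 / n_shots
p = (1 + np.cos(phi)) / 2
counts = rng.binomial(n_shots, p, n_trials)

# maximum likelihood estimate inverts the probability model
phi_mle = np.arccos(np.clip(2 * counts / n_shots - 1, -1.0, 1.0))

var_mle = float(phi_mle.var())
crb = 1 / n_shots
```

As the paper notes, the hard part in practice is that real likelihoods for phase can be non-identifiable; this toy is identifiable on (0, π) by construction.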
7

Schmittfull, Marcel. "Large-scale structure non-Gaussianities with modal methods." Proceedings of the International Astronomical Union 11, S308 (2014): 67–68. http://dx.doi.org/10.1017/s1743921316009649.

Full text
Abstract:
Relying on a separable modal expansion of the bispectrum, the implementation of a fast estimator for the full bispectrum of a 3d particle distribution is presented. The computational cost of accurate bispectrum estimation is negligible relative to simulation evolution, so the bispectrum can be used as a standard diagnostic whenever the power spectrum is evaluated. As an application, the time evolution of gravitational and primordial dark matter bispectra was measured in a large suite of N-body simulations. The bispectrum shape changes characteristically when the cosmic web becomes dominated by filaments and halos, therefore providing a quantitative probe of 3d structure formation. Our measured bispectra are determined by ∼ 50 coefficients, which can be used as fitting formulae in the nonlinear regime and for non-Gaussian initial conditions. We also compare the measured bispectra with predictions from the Effective Field Theory of Large Scale Structures (EFTofLSS).
APA, Harvard, Vancouver, ISO, and other styles
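The bispectrum being estimated is the segment-averaged triple product of Fourier coefficients. A hedged one-dimensional sketch (the paper works with full 3d particle distributions and a separable modal expansion, neither of which is reproduced here) shows why phase-coupled modes light up the estimator while independent phases average away:

```python
import numpy as np

rng = np.random.default_rng(5)
n_seg, seg_len, k1, k2 = 64, 128, 5, 9

def bispectrum(segments, k1, k2):
    # segment-averaged estimate B(k1, k2) = < F[k1] F[k2] conj(F[k1 + k2]) >
    F = np.fft.fft(segments, axis=1)
    return np.mean(F[:, k1] * F[:, k2] * np.conj(F[:, k1 + k2]))

t = np.arange(seg_len)

def make_segments(coupled):
    segs = []
    for _ in range(n_seg):
        p1, p2 = rng.uniform(0, 2 * np.pi, 2)
        # the mode at k1 + k2 is phase-locked to p1 + p2 when coupled
        p3 = p1 + p2 if coupled else rng.uniform(0, 2 * np.pi)
        segs.append(np.cos(2 * np.pi * k1 * t / seg_len + p1)
                    + np.cos(2 * np.pi * k2 * t / seg_len + p2)
                    + np.cos(2 * np.pi * (k1 + k2) * t / seg_len + p3))
    return np.array(segs)

b_coupled = abs(bispectrum(make_segments(True), k1, k2))
b_random = abs(bispectrum(make_segments(False), k1, k2))
```

With coupling, the three phases cancel in every segment, so the triple products add coherently; with independent phases they random-walk toward zero.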
8

Rossberg, A. G. "On the Limits of Spectral Methods for Frequency Estimation." International Journal of Bifurcation and Chaos 14, no. 06 (2004): 2115–23. http://dx.doi.org/10.1142/s0218127404010503.

Full text
Abstract:
An algorithm is presented which generates pairs of oscillatory random time series which have identical periodograms but differ in the number of oscillations. This result indicates the intrinsic limitations of spectral methods when it comes to the task of measuring frequencies. Other examples, one from medicine and one from bifurcation theory, are given, which also exhibit these limitations of spectral methods. For two methods of spectral estimation it is verified that the particular way end points are treated, which is specific to each method, is, for long enough time series, not relevant for the main result.
APA, Harvard, Vancouver, ISO, and other styles
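The intrinsic ambiguity Rossberg describes can be demonstrated directly: randomizing Fourier phases while keeping magnitudes yields a different time series with an identical periodogram. (The paper's construction additionally controls the number of oscillations; this sketch only shows that the periodogram does not determine the waveform.)

```python
import numpy as np

rng = np.random.default_rng(11)
x = rng.normal(size=256)

# randomize Fourier phases while keeping every magnitude: the result is a
# different real time series with an identical periodogram
F = np.fft.rfft(x)
phases = np.exp(1j * rng.uniform(0, 2 * np.pi, F.size))
phases[0] = 1.0        # DC bin must stay real
phases[-1] = 1.0       # Nyquist bin must stay real (even-length series)
y = np.fft.irfft(F * phases, n=x.size)

same_periodogram = bool(np.allclose(np.abs(np.fft.rfft(y)), np.abs(F)))
different_series = not np.allclose(x, y)
```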
9

Salah, Mukhtar M., M. El-Morshedy, M. S. Eliwa, and Haitham M. Yousof. "Expanded Fréchet Model: Mathematical Properties, Copula, Different Estimation Methods, Applications and Validation Testing." Mathematics 8, no. 11 (2020): 1949. http://dx.doi.org/10.3390/math8111949.

Full text
Abstract:
Extreme value theory is expanded by proposing and studying a new version of the Fréchet model. Some new bivariate type extensions using the Farlie–Gumbel–Morgenstern copula, modified Farlie–Gumbel–Morgenstern copula, Clayton copula, and Rényi's entropy copula are derived. After a quick study of its properties, different non-Bayesian estimation methods under uncensored schemes are considered, such as the maximum likelihood estimation method, Anderson–Darling estimation method, ordinary least squares estimation method, Cramér–von Mises estimation method, weighted least squares estimation method, left-tail Anderson–Darling estimation method, and right-tail Anderson–Darling estimation method. Numerical simulations were performed to compare the estimation methods using different sample sizes for three different combinations of parameters. The Barzilai–Borwein algorithm was employed via a simulation study. Three applications were presented for measuring the flexibility and the importance of the new model in comparison with competitive distributions under the uncensored scheme. Using the approach of the Bagdonavičius–Nikulin goodness-of-fit test for validation under right-censored data, we propose a modified chi-square goodness-of-fit test for the new model. The modified goodness-of-fit test statistic was applied to a right-censored real data set of leukemia-free survival times for autologous transplants. Based on the maximum likelihood estimators on the initial data, the modified goodness-of-fit test recovered the information lost through grouping the data and followed the chi-square distribution. All elements of the modified goodness-of-fit test criteria are explicitly derived and given.
APA, Harvard, Vancouver, ISO, and other styles
10

Edwards, Julianne M., and W. Holmes Finch. "Recursive Partitioning Methods for Data Imputation in the Context of Item Response Theory: A Monte Carlo Simulation." Psicológica Journal 39, no. 1 (2018): 88–117. http://dx.doi.org/10.2478/psicolj-2018-0005.

Full text
Abstract:
Missing data is a common problem faced by psychometricians and measurement professionals. To address this issue, a number of techniques have been proposed for handling missing data in Item Response Theory. These methods include several types of data imputation methods (corrected item mean substitution imputation, response function imputation, multiple imputation, and the EM algorithm) as well as approaches that do not rely on the imputation of missing values (treating the item as not presented, or coding missing responses as incorrect or as fractionally correct). Of these methods, even though multiple imputation has demonstrated the best performance in prior research, higher MAE was still present. Given this higher model parameter estimation MAE for even the best performing missing data methods, the goal of this simulation study was to explore the performance of a set of potentially promising data imputation methods based on recursive partitioning. Results of this study demonstrated that approaches combining multivariate imputation by chained equations with recursive partitioning algorithms yield data with relatively low estimation MAE for both item difficulty and item discrimination. Implications of these findings are discussed.
APA, Harvard, Vancouver, ISO, and other styles
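The chained-equations idea behind the best-performing methods can be sketched in miniature. This toy regresses one incomplete column on one complete column and refills the predictions; real MICE cycles over many variables and draws imputations with noise (and the paper replaces the regressions with recursive partitioning), so treat every name and number below as illustrative only.

```python
import numpy as np

rng = np.random.default_rng(8)
n = 500
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)
miss = rng.random(n) < 0.2                  # ~20% missing at random
x2_obs = np.where(miss, np.nan, x2)

# chained-equations flavour in miniature: start from mean imputation, then
# repeatedly regress the incomplete column on the complete one and refill
filled = np.where(miss, np.nanmean(x2_obs), x2_obs)
for _ in range(5):
    slope, intercept = np.polyfit(x1, filled, 1)
    filled = np.where(miss, intercept + slope * x1, x2_obs)

# regression imputation exploits the correlation that mean imputation ignores
mae_mean = float(np.abs(np.nanmean(x2_obs) - x2[miss]).mean())
mae_reg = float(np.abs(filled[miss] - x2[miss]).mean())
```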
More sources

Dissertations / Theses on the topic "Estimation theory – Simulation methods"

1

Laughlin, Trevor William. "A parametric and physics-based approach to structural weight estimation of the hybrid wing body aircraft." Thesis, Georgia Institute of Technology, 2012. http://hdl.handle.net/1853/45829.

Full text
Abstract:
Estimating the structural weight of a Hybrid Wing Body (HWB) aircraft during conceptual design has proven to be a significant challenge due to its unconventional configuration. Aircraft structural weight estimation is critical during the early phases of design because inaccurate estimations could result in costly design changes or jeopardize the mission requirements and thus degrade the concept's overall viability. The tools and methods typically employed for this task are inadequate since they are derived from historical data generated by decades of tube-and-wing style construction. In addition to the limited applicability of these empirical models, the conceptual design phase requires that any new tools and methods be flexible enough to enable design space exploration without consuming a significant amount of time and computational resources. This thesis addresses these challenges by developing a parametric and physics-based modeling and simulation (M&S) environment for the purpose of HWB structural weight estimation. The tools in the M&S environment are selected based on their ability to represent the unique HWB geometry and model the physical phenomena present in the centerbody section. The new M&S environment is used to identify key design parameters that significantly contribute to the variability of the HWB centerbody structural weight and also used to generate surrogate models. These surrogate models can augment traditional aircraft sizing routines and provide improved structural weight estimations.
APA, Harvard, Vancouver, ISO, and other styles
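The surrogate-model step of the thesis, fitting a cheap response surface to a handful of expensive simulations, can be sketched with a toy one-parameter "weight" function. The function, sample points, and quadratic form are all assumptions for illustration, not the thesis's M&S environment.

```python
import numpy as np

def expensive_weight(x):
    # stand-in for a physics-based structural analysis: centerbody weight
    # as a smooth function of a single design parameter
    return 1000 + 150 * x + 40 * x ** 2

# sample the "expensive" model at a few design points ...
x_train = np.linspace(0.0, 2.0, 8)
y_train = expensive_weight(x_train)

# ... and fit a cheap quadratic surrogate by least squares
surrogate = np.poly1d(np.polyfit(x_train, y_train, 2))

# the surrogate can now replace the expensive model inside a sizing loop
err = abs(surrogate(1.37) - expensive_weight(1.37))
```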
2

Tang, Qi. "Comparison of Different Methods for Estimating Log-normal Means." Digital Commons @ East Tennessee State University, 2014. https://dc.etsu.edu/etd/2338.

Full text
Abstract:
The log-normal distribution is a popular model in many areas, especially in biostatistics and survival analysis, where the data tend to be right skewed. In our research, a total of ten different estimators of log-normal means are compared theoretically. Simulations are done using different values of the parameters and the sample size. As a result of the comparison, the "degree-of-freedom adjusted" maximum likelihood estimator and the Bayesian estimator under quadratic loss are the best when using the mean square error (MSE) as a criterion. The ten estimators are applied to a real dataset, an environmental study from the Naval Construction Battalion Center (NCBC) Superfund site in Rhode Island.
APA, Harvard, Vancouver, ISO, and other styles
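Two of the simpler estimators compared in such studies, the plain sample mean and the plug-in maximum likelihood estimator exp(μ̂ + σ̂²/2), can be pitted against each other in a few lines of Monte Carlo. The parameter values and trial counts below are arbitrary illustrations, not the thesis's design.

```python
import numpy as np

rng = np.random.default_rng(10)
mu, sigma, n, trials = 1.0, 1.0, 50, 4000
true_mean = np.exp(mu + sigma ** 2 / 2)   # log-normal mean exp(mu + sigma^2/2)

naive, mle = [], []
for _ in range(trials):
    x = rng.lognormal(mu, sigma, n)
    naive.append(x.mean())                       # plain sample mean
    logx = np.log(x)
    # plug-in maximum likelihood estimator exp(mu_hat + sigma_hat^2 / 2)
    mle.append(np.exp(logx.mean() + logx.var() / 2))

mse_naive = float(np.mean((np.array(naive) - true_mean) ** 2))
mse_mle = float(np.mean((np.array(mle) - true_mean) ** 2))
```

The thesis compares ten such estimators under MSE; this sketch only shows the simulation template.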
3

Alcantara, Adeilton Pedro de, 1973. "Inferência não paramétrica baseada no método H-splines para a intensidade de processos de Poisson não-homogêneos" [Non-parametric inference based on the H-splines method for the intensity of non-homogeneous Poisson processes]. Universidade Estadual de Campinas, 2012. http://repositorio.unicamp.br/jspui/handle/REPOSIP/306513.

Full text
Abstract:
Advisors: Ronaldo Dias, Nancy Lopes Garcia. Doctoral thesis (Doutorado em Estatística), Universidade Estadual de Campinas, Instituto de Matemática, Estatística e Computação Científica. Abstract: The main goal of this thesis is to propose a new methodology based on the method of expansion by B-spline bases, and its variants, for the non-parametric estimation of the intensity function of non-homogeneous Poisson processes. Note: the complete abstract is available in the full electronic document.
APA, Harvard, Vancouver, ISO, and other styles
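A non-homogeneous Poisson process with bounded intensity can be simulated by Lewis–Shedler thinning, and its intensity crudely estimated by binning; the thesis replaces the histogram with a smooth spline expansion. The intensity function below is an arbitrary illustration.

```python
import numpy as np

rng = np.random.default_rng(12)

def simulate_nhpp(intensity, lam_max, T):
    # Lewis-Shedler thinning: draw a homogeneous Poisson process at rate
    # lam_max on [0, T], keep each point t with probability intensity(t)/lam_max
    n = rng.poisson(lam_max * T)
    t = np.sort(rng.uniform(0.0, T, n))
    keep = rng.random(n) < intensity(t) / lam_max
    return t[keep]

intensity = lambda t: 10.0 * (1.0 + np.sin(t))   # bounded above by 20
events = simulate_nhpp(intensity, 20.0, 50.0)

# crude nonparametric intensity estimate: event counts in unit-width bins
counts, edges = np.histogram(events, bins=np.arange(0.0, 51.0))
```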
4

Choy, Vivian K. Y. 1971. "Estimating the inevitability of fast oscillations in model systems with two timescales." Monash University, Dept. of Mathematics and Statistics, 2001. http://arrow.monash.edu.au/hdl/1959.1/9068.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Wallman, Kaj Mikael Joakim. "Computational methods for the estimation of cardiac electrophysiological conduction parameters in a patient specific setting." Thesis, University of Oxford, 2013. http://ora.ox.ac.uk/objects/uuid:2d5573b9-5115-4434-b9c8-60f8d0531f86.

Full text
Abstract:
Cardiovascular disease is the primary cause of death globally. Although this group encompasses a heterogeneous range of conditions, many of these diseases are associated with abnormalities in the cardiac electrical propagation. In these conditions, structural abnormalities in the form of scars and fibrotic tissue are known to play an important role, leading to a high individual variability in the exact disease mechanisms. Because of this, clinical interventions such as ablation therapy and CRT that work by modifying the electrical propagation should ideally be optimized on a patient specific basis. As a tool for optimizing these interventions, computational modelling and simulation of the heart have become increasingly important. However, in order to construct these models, a crucial step is the estimation of tissue conduction properties, which have a profound impact on the cardiac activation sequence predicted by simulations. Information about the conduction properties of the cardiac tissue can be gained from electrophysiological data, obtained using electroanatomical mapping systems. However, as in other clinical modalities, electrophysiological data are often sparse and noisy, and this results in high levels of uncertainty in the estimated quantities. In this dissertation, we develop a methodology based on Bayesian inference, together with a computationally efficient model of electrical propagation to achieve two main aims: 1) to quantify values and associated uncertainty for different tissue conduction properties inferred from electroanatomical data, and 2) to design strategies to optimise the location and number of measurements required to maximise information and reduce uncertainty. The methodology is validated in several studies performed using simulated data obtained from image-based ventricular models, including realistic fibre orientation and conduction heterogeneities. 
Subsequently, by using the developed methodology to investigate how the uncertainty decreases in response to added measurements, we derive an a priori index for placing electrophysiological measurements in order to optimise the information content of the collected data. Results show that the derived index has a clear benefit in minimising the uncertainty of inferred conduction properties compared to a random distribution of measurements, suggesting that the methodology presented in this dissertation provides an important step towards improving the quality of the spatiotemporal information obtained using electroanatomical mapping.
APA, Harvard, Vancouver, ISO, and other styles
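The core inference step, recovering a conduction parameter with quantified uncertainty from sparse noisy measurements, can be sketched with a one-parameter grid posterior. The linear distance/velocity model, noise level, and measurement sites below are invented for illustration and bear no relation to the thesis's image-based ventricular models.

```python
import numpy as np

rng = np.random.default_rng(13)

# toy problem: activation time at distance d from a pacing site is
# t = d / v + noise; infer conduction velocity v on a grid posterior
v_true, noise_sd = 0.6, 5.0                      # illustrative units
d = np.array([10.0, 25.0, 40.0, 60.0, 80.0])    # measurement sites
t_obs = d / v_true + rng.normal(0.0, noise_sd, d.size)

v_grid = np.linspace(0.2, 1.5, 1000)
# Gaussian likelihood for every candidate velocity, flat prior on the grid
resid = (t_obs[None, :] - d[None, :] / v_grid[:, None]) / noise_sd
post = np.exp(-0.5 * (resid ** 2).sum(axis=1))
post /= post.sum()

v_mean = float(np.sum(v_grid * post))            # posterior mean
v_sd = float(np.sqrt(np.sum((v_grid - v_mean) ** 2 * post)))
```

The posterior standard deviation is exactly the kind of uncertainty the thesis uses to decide where the next measurement would be most informative.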
6

野口, 裕之 (Noguchi, Hiroyuki). "項目困難度の分布の偏りが IRT 項目パラメタの発見的推定値に与える影響" [The effect of skewed item-difficulty distributions on heuristic estimates of IRT item parameters]. 名古屋大学教育学部 (Nagoya University School of Education), 1992. http://hdl.handle.net/2237/3870.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Moravej, Mohammadtaghi. "Investigating Scale Effects on Analytical Methods of Predicting Peak Wind Loads on Buildings." FIU Digital Commons, 2018. https://digitalcommons.fiu.edu/etd/3799.

Full text
Abstract:
Large-scale testing of low-rise buildings or components of tall buildings is essential, as it provides more representative information about realistic wind effects than typical small-scale studies; but as the model size increases, relatively less large-scale turbulence can be generated in the oncoming flow. This results in a turbulence power spectrum lacking low-frequency turbulence content. This deficiency is known to have significant effects on the estimated peak wind loads. To overcome these limitations, the method of Partial Turbulence Simulation (PTS) has recently been developed in the FIU Wall of Wind lab to analytically compensate for the effects of the missing low-frequency content of the spectrum. This method requires post-test analysis procedures and is based on quasi-steady assumptions. The current study was an effort to enhance that technique by investigating the effect of scaling and the range of applicability of the method, considering the limitations arising from the underlying theory, and to simplify the 2DPTS (which includes both in-plane components of the turbulence) by proposing a weighted average method. Investigating the effect of Reynolds number on peak aerodynamic pressures was another objective of the study. The results from five tested building models show that as the model size was increased, PTS results showed better agreement with the available field data from the TTU building. Although for the smaller models (i.e., 1:100, 1:50) almost the full range of the turbulence spectrum was present, the highest peaks observed at full scale were not reproduced, which apparently was because of the Reynolds number effect. The most accurate results were obtained when the PTS was used in the case with the highest Reynolds number, which was the 1:6 scale model with less than 5% blockage and a xLum/bm ratio of 0.78. Besides that, the results showed that the weighted average PTS method can be used in lieu of the 2DPTS approach. 
Thus, to achieve the most accurate results, a large-scale test followed by a PTS peak estimation method appears to be the desirable approach, which also allows xLum/bm values much smaller than the ASCE-recommended numbers.
APA, Harvard, Vancouver, ISO, and other styles
8

King, David R. "A bayesian solution for the law of categorical judgment with category boundary variability and examination of robustness to model violations." Thesis, Georgia Institute of Technology, 2013. http://hdl.handle.net/1853/52960.

Full text
Abstract:
Previous solutions for the Law of Categorical Judgment with category boundary variability have either constrained the standard deviations of the category boundaries in some way or have violated the assumptions of the scaling model. In the current work, a fully Bayesian Markov chain Monte Carlo solution for the Law of Categorical Judgment is given that estimates all model parameters (i.e., scale values, category boundaries, and the associated standard deviations). The importance of measuring category boundary standard deviations is discussed in the context of previous research in signal detection theory, which gives evidence of interindividual variability in how respondents perceive category boundaries and even intraindividual variability in how a respondent perceives category boundaries across trials. Although the measurement of category boundary standard deviations appears to be important for describing the way respondents perceive category boundaries on the latent scale, the inclusion of category boundary standard deviations in the scaling model exposes an inconsistency between the model and the rating method. Namely, with category boundary variability, the scaling model suggests that a respondent could experience disordinal category boundaries on a given trial. However, the idea that a respondent actually experiences disordinal category boundaries seems unlikely. The discrepancy between the assumptions of the scaling model and the way responses are made at the individual level indicates that the assumptions of the model will likely not be met. Therefore, the current work examined how well model parameters could be estimated when the assumptions of the model were violated in various ways as a consequence of disordinal category boundary perceptions. 
A parameter recovery study examined the effect of model violations on estimation accuracy by comparing estimates obtained from three response processes that violated the assumptions of the model with estimates obtained from a novel response process that did not violate the assumptions of the model. Results suggest all parameters in the Law of Categorical Judgment can be estimated reasonably well when these particular model violations occur, albeit to a lesser degree of accuracy than when the assumptions of the model are met.
APA, Harvard, Vancouver, ISO, and other styles
9

Pettersson, Tobias. "Global optimization methods for estimation of descriptive models." Thesis, Linköping University, Department of Electrical Engineering, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11781.

Full text
Abstract:
Using mathematical models to understand and store knowledge about a system is not a new field in science, with early contributions dating back to, e.g., Kepler's laws of planetary motion.

The aim is to obtain such comprehensive, predictive, and quantitative knowledge about a phenomenon that mathematical expressions or models can be used to forecast every relevant detail about it. Such models can be used for reducing pollution from car engines, preventing aviation incidents, or developing new therapeutic drugs. Models used to forecast, or predict, the behavior of a system are referred to as predictive models. For such models, the estimation problem aims to find one model; it is well known and can be handled using standard methods for global nonlinear optimization.

Descriptive models are used to obtain and store quantitative knowledge of a system. Estimation of descriptive models has received little attention in the literature so far; instead, the methods used for predictive models have been applied. Rather than finding one particular model, parameter estimation for descriptive models aims to find every model that contains descriptive information about the system. Thus, the parameter estimation problem for descriptive models cannot be stated as a standard optimization problem.

The main objective of this thesis is to propose methods for estimation of descriptive models. This is done using methods for nonlinear optimization, including both new and existing theory.
APA, Harvard, Vancouver, ISO, and other styles
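The distinction the thesis draws can be made concrete: descriptive estimation returns the set of all parameters whose fit is acceptable, not the single loss minimizer. A toy sketch with an exponential-decay model, where the model, threshold rule, and grid are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(14)

x = np.linspace(0.0, 1.0, 60)
y = np.exp(-2.0 * x) + rng.normal(0.0, 0.05, x.size)   # true decay rate 2.0

def rmse(k):
    return np.sqrt(np.mean((np.exp(-k * x) - y) ** 2))

grid = np.linspace(0.1, 5.0, 500)
losses = np.array([rmse(k) for k in grid])

# descriptive estimation: keep EVERY model whose fit is acceptable,
# here (arbitrarily) defined as within 20% of the best achieved loss
threshold = 1.2 * losses.min()
acceptable = grid[losses <= threshold]       # a set of models, not a point
```

A predictive approach would report only `grid[np.argmin(losses)]`; the set `acceptable` is the descriptive answer.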
10

Davis, J. Wade. "Wavelet-based methods for estimation and discrimination /." free to MU campus, to others for purchase, 2003. http://wwwlib.umi.com/cr/mo/fullcit?p3099616.

Full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Books on the topic "Estimation theory – Simulation methods"

1

Knopov, Pavel Solomonovich. Simulation and optimization methods in risk and reliability theory. Nova Science Publishers, 2008.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Satdarova, Faina. DIFFRACTION ANALYSIS OF DEFORMED METALS: Theory, Methods, Programs. Academus Publishing, 2019. http://dx.doi.org/10.31519/monography_1598.

Full text
Abstract:
A general analysis of the distribution of crystal orientations and dislocation density in polycrystalline systems is presented. The information recovered by adopting X-ray diffraction is new to the structural states of polycrystals. Shear phase transformations in metals, at the macroscopic and microscopic levels, become a clear process. Visualization of the advances is produced by the programs included in the delivered package. Mathematical model development, experimental design, optimal statistical estimation, and simulation of the system under study and its evolution under loading serve as the instrumentation. Problem-oriented software, once installed, will help bring these advanced methods into research and study. The automation programs passed testing at the National University of Science and Technology "MISIS" (Moscow, Russian Federation). The reader gains an advantage in theoretical and experimental research in the field of metal physics.
APA, Harvard, Vancouver, ISO, and other styles
3

Lerch, F. J. Optimum data weighting and error calibration for estimation of gravitational parameters. National Aeronautics and Space Administration, Goddard Space Flight Center, 1989.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

Wegkamp, M. H. Entropy methods in statistical estimation. CWI, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Manski, Charles F. Analog estimation methods in econometrics. Chapman and Hall, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Horowitz, Joel L. Semiparametric methods in econometrics. Springer, 1998.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Efromovich, Sam. Nonparametric curve estimation: Methods, theory and applications. Springer, 1999.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Simulation and social theory. Sage, 2001.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Balakrishnan, N. Order statistics and inference: Estimation methods. Academic Press, 1991.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Händel, Peter. Estimation methods for narrow-band signals. Uppsala University, Dept. of Technology Systems and Control Group, 1993.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
More sources

Book chapters on the topic "Estimation theory – Simulation methods"

1

Amari, Shun-ichi. "Asymptotic Theory of Estimation." In Differential-Geometrical Methods in Statistics. Springer New York, 1985. http://dx.doi.org/10.1007/978-1-4612-5056-2_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Devroye, Luc, and Gábor Lugosi. "Minimax Theory." In Combinatorial Methods in Density Estimation. Springer New York, 2001. http://dx.doi.org/10.1007/978-1-4613-0125-7_15.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Vapnik, Vladimir N. "Methods of Function Estimation." In The Nature of Statistical Learning Theory. Springer New York, 2000. http://dx.doi.org/10.1007/978-1-4757-3264-1_7.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Golyandina, N., and V. Nekrutkin. "Estimation Errors for Functionals on Measure Spaces." In Advances in Stochastic Simulation Methods. Birkhäuser Boston, 2000. http://dx.doi.org/10.1007/978-1-4612-1318-5_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Savchuk, Vladimir, and Chris P. Tsokos. "Nonparametric Bayes Estimation." In Bayesian Theory and Methods with Applications. Atlantis Press, 2011. http://dx.doi.org/10.2991/978-94-91216-14-5_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Micheas, Athanasios Christou. "Rudimentary Models and Simulation Methods." In Theory of Stochastic Objects. Chapman and Hall/CRC, 2018. http://dx.doi.org/10.1201/9781315156705-1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Kobayashi, Kei, and Henry P. Wynn. "Computational Algebraic Methods in Efficient Estimation." In Geometric Theory of Information. Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-05317-2_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Berry, Stuart. "Appendix A: Queueing Theory." In Simulation Foundations, Methods and Applications. Springer International Publishing, 2017. http://dx.doi.org/10.1007/978-3-319-55417-4_16.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Bretthorst, G. Larry. "Moment Estimation using Bayesian Probability Theory." In Maximum Entropy and Bayesian Methods. Springer Netherlands, 1991. http://dx.doi.org/10.1007/978-94-011-3460-6_32.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Todling, Ricardo. "Estimation theory and atmospheric data assimilation." In Inverse Methods in Global Biogeochemical Cycles. American Geophysical Union, 2000. http://dx.doi.org/10.1029/gm114p0049.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Estimation theory – Simulation methods"

1

Ameri, Ahmed Al, Cristian Nichita, Hilal Abbood, and Ali Al Atabi. "Fast Estimation Method for Selection of Optimal Distributed Generation Size Using Kalman Filter and Graph Theory." In 2015 17th UKSim-AMSS International Conference on Modelling and Simulation (UKSim). IEEE, 2015. http://dx.doi.org/10.1109/uksim.2015.105.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wada, Ryota, and Takuji Waseda. "Likelihood-Weighted Method of General Pareto Distribution for Extreme Wave Height Estimation." In ASME 2013 32nd International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2013. http://dx.doi.org/10.1115/omae2013-10792.

Full text
Abstract:
In designing ocean structures, estimating the largest wave height a structure may encounter over its lifetime is a critical issue, but wave observation data are often sparse in space and time. Because of the limited data available, estimation errors are inevitably large. For an economical and robust structural design, the probability density function of the extreme wave height and its confidence interval must be theoretically quantified from the limited information available. Extreme value estimates have traditionally been made by finding the best-fitting distribution from limited observations and extrapolating it over the desired long period. Estimates based on frequentist methods lack generality in confidence-interval estimation, especially when the data size is small. Another recently developed technique is based on Bayesian statistics, which provides inference of uncertainty. Previous studies use informative and non-informative priors and Markov Chain Monte Carlo (MCMC) simulation for estimation. We have developed a "Likelihood-Weighted Method (LWM)" to objectively evaluate the probability density function of the extreme value. The method is based on extreme value theory and Bayesian statistics. Our approach is to use an ignorance prior to relate each parameter set's likelihood to its probability. The method is pragmatic because its numerical implementation does not require MCMC. The theoretical background and practical advantages of LWM are described. Examples from randomly generated data show the performance of the method, and application to real wave data reveals the poor estimates of previous methods that do not use the Bayesian theorem. Quantifying the probability of each extreme value distribution enables probability-weighted evaluation for inferences such as the maximum wave height probability density function. The new inference derived from this method is useful for changing the design methodologies of ocean structures.
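The core of the likelihood-weighted idea described in this abstract can be sketched in a few lines: evaluate the Generalized Pareto likelihood of threshold exceedances on a parameter grid, then normalize the likelihoods into weights (the posterior under a flat ignorance prior), with no MCMC required. This is an illustrative sketch, not the authors' code; the grid, data, and parameter values are hypothetical.

```python
import math
import random

def gpd_logpdf(x, shape, scale):
    """Log-density of the Generalized Pareto Distribution at x >= 0."""
    z = 1.0 + shape * x / scale
    if z <= 0 or scale <= 0:
        return -math.inf
    return -math.log(scale) - (1.0 / shape + 1.0) * math.log(z)

def likelihood_weights(data, grid):
    """Weight each (shape, scale) grid point by its normalized likelihood,
    i.e. the posterior probability under a flat (ignorance) prior."""
    logls = [sum(gpd_logpdf(x, sh, sc) for x in data) for sh, sc in grid]
    m = max(logls)                       # subtract max for numerical stability
    ws = [math.exp(l - m) for l in logls]
    total = sum(ws)
    return [w / total for w in ws]

random.seed(1)
# Synthetic threshold exceedances (exponential tail, i.e. GPD with shape -> 0)
data = [random.expovariate(1.0) for _ in range(50)]
grid = [(sh, sc) for sh in (0.05, 0.1, 0.2) for sc in (0.5, 1.0, 1.5)]
weights = likelihood_weights(data, grid)

# Any inference (e.g. a return level) can now be posterior-weighted over the grid
shape_hat = sum(w * sh for w, (sh, sc) in zip(weights, grid))
print(round(sum(weights), 6))  # weights sum to 1
```

The same weights can multiply any per-parameter-set quantity (such as a 100-year wave height) to produce a probability-weighted estimate.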
APA, Harvard, Vancouver, ISO, and other styles
3

Matte, Christopher-Denny, and Tsz-Ho Kwok. "Simulation of Hyper-Elasticity by Shape Estimation." In ASME 2020 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/detc2020-22583.

Full text
Abstract:
Abstract The simulation of complex geometries and non-linear deformation has been a challenge for standard simulation methods. There has traditionally been a trade-off between performance and accuracy. With the popularity of additive manufacturing and the new design space it enables, the challenges are even more prevalent. Additionally, multiple additive manufacturing techniques now enable the use of hyperelastic materials as raw material for fabrication, along with multi-material capabilities. This allows designers more freedom, but also introduces new challenges for the control and simulation of the printed parts. In this paper, a novel approach to implementing non-linear material capabilities is devised with negligible additional computational cost for geometry-based approaches. Material curves are fitted with a polynomial expression that can determine the tangent modulus, or stiffness, of a material based on strain energy. The moduli of all elements are compared to determine relative shape factors used to establish the blended shape of an element. This process is done dynamically to update the stiffness of a material in real time, for any number of materials, regardless of linear or non-linear material curves.
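The polynomial-fit step this abstract describes, fitting a material curve and reading off the tangent modulus as the fit's derivative, can be sketched as follows. This is a generic illustration, not the paper's implementation; the cubic stand-in material curve and fit order are hypothetical.

```python
import numpy as np

# Hypothetical stress-strain samples standing in for a measured material curve
strain = np.linspace(0.0, 0.5, 20)
stress = 2.0 * strain + 4.0 * strain**3   # stiffening, non-linear response

# Fit a cubic polynomial to the material curve
coeffs = np.polyfit(strain, stress, 3)

# Tangent modulus = d(stress)/d(strain), taken from the fit's derivative
dcoeffs = np.polyder(coeffs)

def tangent_modulus(eps):
    """Local stiffness of the fitted material curve at strain eps."""
    return float(np.polyval(dcoeffs, eps))

print(round(tangent_modulus(0.0), 3))  # initial stiffness (~2.0 here)
print(round(tangent_modulus(0.5), 3))  # stiffened at large strain (~5.0 here)
```

In a solver loop, re-evaluating this derivative per element each step is what lets the stiffness update dynamically at negligible extra cost.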
APA, Harvard, Vancouver, ISO, and other styles
4

Petricic, Martin, and Alaa E. Mansour. "Estimation of the Long Term Correlation Coefficients by Simulation." In ASME 2010 29th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2010. http://dx.doi.org/10.1115/omae2010-20637.

Full text
Abstract:
This paper proposes a simulation method for obtaining estimates of the long-term correlation coefficients between different low-frequency wave-induced loads acting on a ship hull. They are an essential part of the load combination procedures in design and strength evaluations. Existing theory is limited to linear time-invariant systems with weakly stationary stochastic inputs, such as waves during a single sea state (short-term). The simulation treats the non-stationary wave elevations during the ship's entire life (long-term) as a sequence of different stationary Gaussian stochastic processes. Different sea states (HS, T0, Wave Direction) are sampled, using rejection sampling, from the joint probability density functions fitted to every Marsden zone on the ship's route. The time series of the loads are simulated from the load spectra for each sea state, including the effects of loading condition, heading, speed, seasonality, and voluntary as well as involuntary speed reduction. The estimates of the correlation coefficients are then calculated from these time series. The simulation time can be significantly reduced (to the order of seconds rather than hours and days) by introducing the seasonal variations into a single voyage. It is proven that the estimate of the correlation coefficient obtained by simulating only a single voyage approaches the true correlation coefficient in probability as the number of simulated load values increases. The simulation method can also be used for finding the long-term exceedance probabilities of the peak values of individual loads as well as for analyzing various load combinations (linear and nonlinear). Related concepts and limitations of this method are demonstrated by an example of a containership operating between Boston, MA and Southampton, UK.
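The rejection-sampling step mentioned in this abstract, drawing sea states (HS, T0) from a fitted joint density, can be sketched as below. The joint density, its bounds, and all parameters are hypothetical stand-ins for a fitted Marsden-zone model, not the paper's actual data.

```python
import math
import random

random.seed(0)

def joint_pdf(hs, t0):
    """Hypothetical (unnormalized) joint density of significant wave height
    Hs (m) and zero-crossing period T0 (s), standing in for a fitted model."""
    if not (0 < hs < 10 and 4 < t0 < 16):
        return 0.0
    return math.exp(-((hs - 2.5) ** 2) / 2.0 - ((t0 - 8.0) ** 2) / 8.0)

def rejection_sample(n):
    """Draw n sea states by rejection sampling against a uniform proposal."""
    fmax = joint_pdf(2.5, 8.0)  # density peak bounds the accept/reject envelope
    out = []
    while len(out) < n:
        hs, t0 = random.uniform(0, 10), random.uniform(4, 16)
        if random.uniform(0, fmax) < joint_pdf(hs, t0):
            out.append((hs, t0))
    return out

states = rejection_sample(1000)
mean_hs = sum(h for h, _ in states) / len(states)
print(round(mean_hs, 2))  # concentrated near the 2.5 m mode of the density
```

Each accepted sea state would then drive a short-term load spectrum simulation, and the correlation coefficients are estimated from the concatenated load time series.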
APA, Harvard, Vancouver, ISO, and other styles
5

Valenti, Mario, Z. J. Delalic, and S. Jahanian. "Estimation of Power Density in VLSI Circuits Using Monte Carlo Simulation." In ASME 1999 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1999. http://dx.doi.org/10.1115/detc99/rsafp-8847.

Full text
Abstract:
Abstract CMOS technology in the past twenty years has followed the path of device scaling for achieving density, speed, and power improvements. The advancement of lithographic techniques also propelled the scaling of fine lines to 1 μm width and below. Since the integration density at the chip level is constantly increasing, there is a need to study power dissipation and the resulting heat propagation between circuit components. This research studies different methods to analyze the power dissipation and temperature distribution of a full CMOS adder. After examining many methods, the STEPS method was selected for determination of the power dissipation in a VLSI CMOS chip. An experiment was developed for the dynamic power dissipation, since static power dissipation is negligible in the case of CMOS devices. Using this method, one can look at each circuit node individually as signals are propagated through the chip and determine the power distribution in the form of heat. Therefore, it is possible to construct a 3-dimensional diagram of the actual distribution of heat across the chip.
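The per-node dynamic power estimation this abstract alludes to can be sketched with a generic Monte Carlo over switching activity, using the standard CMOS dynamic-power relation P = a·C·Vdd²·f. This is not the STEPS method itself; every parameter value below is a hypothetical placeholder.

```python
import random

random.seed(42)

# Hypothetical per-node parameters (placeholders, not the paper's data)
VDD = 3.3        # supply voltage, V
FREQ = 100e6     # clock frequency, Hz
CAPS = [random.uniform(10e-15, 50e-15) for _ in range(1000)]  # node caps, F

def mc_dynamic_power(trials):
    """Monte Carlo estimate of mean chip dynamic power, P = a * C * Vdd^2 * f,
    drawing a random switching activity 'a' per node and per trial."""
    totals = []
    for _ in range(trials):
        p = sum(random.uniform(0.0, 0.5) * c * VDD**2 * FREQ for c in CAPS)
        totals.append(p)
    return sum(totals) / trials

p_mean = mc_dynamic_power(200)
print(round(p_mean * 1e3, 2), "mW")  # chip-level mean dynamic power
```

Tracking each node's contribution separately, rather than only the chip total, is what allows a spatial heat-distribution map to be built.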
APA, Harvard, Vancouver, ISO, and other styles
6

Saputro, Dewi Retno Sari, Yuanita Kusuma Wardani, Nafisa Berliana Indah Pratiwi, and Purnami Widyaningsih. "Data simulation with Markov Chain Monte Carlo, Gibbs sampling, and Bayes (beta-binomial) methods as the parameter estimations of spatial bivariate probit regression model." In THE THIRD INTERNATIONAL CONFERENCE ON MATHEMATICS: Education, Theory and Application. AIP Publishing, 2021. http://dx.doi.org/10.1063/5.0040332.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Mitrevski, Pece, and Ilija Hristoski. "Evaluation of Business-Oriented Performance Metrics in e-Commerce using Web-based Simulation." In CARMA 2016 - 1st International Conference on Advanced Research Methods and Analytics. Universitat Politècnica València, 2016. http://dx.doi.org/10.4995/carma2016.2016.2915.

Full text
Abstract:
The Web 2.0 paradigm has radically changed the way businesses are run all around the world. Moreover, e-Commerce has become prevalent in daily shopping activities. For management teams, the assessment, evaluation, and forecasting of online incomes and other business-oriented performance measures have become 'a holy grail', the ultimate question underlying their current and future e-Commerce projects. Within the paper, we describe the development of a Web-based simulation model suitable for their estimation, taking into account multiple operation profiles and scenarios. Specifically, we put the focus on introducing specific classes of e-Customers, as well as the workload characterization of an arbitrary e-Commerce website. On the other hand, we employ and embed the principles of the systems thinking approach and system dynamics into the proposed solution. As a result, a complete simulation model has been developed and is available online. The model, which includes numerous adjustable input variables, can be successfully utilized in making 'what-if'-like insights into a plethora of business-oriented performance metrics for an arbitrary e-Commerce website. This project is also a great example of the power delivered by InsightMaker®, free-of-charge Web-based software suitable for the collaborative online development of models following the systems thinking paradigm.
APA, Harvard, Vancouver, ISO, and other styles
8

Xu, Yanwen, and Pingfeng Wang. "Sequential Sampling Based Reliability Analysis for High Dimensional Rare Events With Confidence Intervals." In ASME 2020 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/detc2020-22146.

Full text
Abstract:
Abstract Accurately analyzing rare failure events at an affordable computational cost is often challenging in many engineering applications, and this is especially true for problems with high-dimensional system inputs. The extremely low probabilities of occurrence of those rare events often lead to large probability estimation errors and low computational efficiency. Thus, it is vital to develop advanced probability analysis methods that are capable of providing robust estimates of rare event probabilities with narrow confidence bounds. Generally, confidence intervals of an estimator can be established based on the central limit theorem, but one of the critical obstacles is low computational efficiency, since the widely used Monte Carlo method often requires a large number of simulation samples to derive a reasonably narrow confidence interval. This paper develops a new probability analysis approach that can be used to derive estimates of rare event probabilities efficiently, with narrow estimation bounds, for high-dimensional problems. The asymptotic behavior of the developed estimator has also been proven theoretically without imposing strong assumptions. Further, an asymptotic confidence interval is established for the developed estimator. The presented study offers important insights into robust estimation of the probability of occurrence of rare events. The accuracy and computational efficiency of the developed technique are assessed with numerical and engineering case studies. Case study results have demonstrated that narrow bounds can be built efficiently using the developed approach, and the true values have always been located within the estimation bounds, indicating good estimation accuracy along with significantly improved efficiency.
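The baseline this abstract improves upon, crude Monte Carlo with a CLT-based confidence interval, can be sketched as follows; it illustrates why rare events need so many samples for a narrow interval. This is the standard crude estimator, not the paper's sequential sampling method, and the example failure event is hypothetical.

```python
import math
import random

random.seed(7)

def mc_probability(event, sampler, n):
    """Crude Monte Carlo failure-probability estimate with a CLT-based
    95% confidence interval (normal approximation of the binomial)."""
    hits = sum(1 for _ in range(n) if event(sampler()))
    p = hits / n
    half = 1.96 * math.sqrt(max(p * (1 - p), 1e-300) / n)
    return p, (p - half, p + half)

# Rare event: a standard normal input exceeding 3 (true p ~ 1.35e-3)
p, (lo, hi) = mc_probability(lambda x: x > 3.0,
                             lambda: random.gauss(0.0, 1.0),
                             200_000)
print(f"p ~ {p:.2e}, 95% CI ({lo:.2e}, {hi:.2e})")
```

Even with 200,000 samples the relative half-width of the interval stays above 10% here, which is the inefficiency that variance-reduction and sequential-sampling schemes target.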
APA, Harvard, Vancouver, ISO, and other styles
9

Fujisawa, Takeharu, Makoto Tsubokura, and Hisao Tanaka. "Study on Estimation of Flow Around Marine Propeller by Large-Scale LES and Affection of Grid Resolution and Reynolds Number." In ASME 2020 39th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/omae2020-18431.

Full text
Abstract:
Abstract To develop a ship with higher propulsion performance, an accurate estimation method is required. The performance of ships is estimated by both experimental methods using scale models and numerical methods based on computational fluid dynamics (CFD) using Reynolds-averaged Navier-Stokes (RANS) models. Experiments with scale models have limitations regarding fast execution and cost reduction due to physical constraints, and we believe it is important to improve numerical methods. Adjustment of turbulence models can improve the estimation accuracy of numerical methods, but it is necessary to continuously compare and verify the numerical and experimental results. There is a limit to the improvement of the estimation accuracy of numerical methods with RANS models. Direct numerical simulation (DNS), which can be solved without modeling the turbulence field, can acquire an accurate flow field, but it is difficult to implement because it requires a huge number of grid cells. In a large eddy simulation (LES) model, if the grid resolution (filter size) is fine, the limitations imposed by the model itself are reduced. LES performed with a sufficiently fine grid resolution may be able to achieve estimation accuracy comparable to DNS without a high computational load. We consider this method promising for accurately estimating ship performance. This is an initial study of a ship performance estimation method using large-scale LES. We performed LES calculations with up to 6.4 billion cells for a propeller open-water test and obtained basic knowledge about large-scale LES calculations. We carried out calculations with different grid resolutions under the same operating conditions, and with different Reynolds numbers at the same resolution, to show how the grid resolution and Reynolds number affect the estimation of propeller performance and the flow around the propeller.
APA, Harvard, Vancouver, ISO, and other styles
10

Marten, David, Alessandro Bianchini, Georgios Pechlivanoglou, et al. "Effects of Airfoil’s Polar Data in the Stall Region on the Estimation of Darrieus Wind Turbine Performance." In ASME Turbo Expo 2016: Turbomachinery Technical Conference and Exposition. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/gt2016-56685.

Full text
Abstract:
Interest in vertical-axis wind turbines (VAWTs) is experiencing a renaissance after most major research projects came to a standstill in the mid-90s in favour of conventional horizontal-axis turbines (HAWTs). Nowadays, the inherent advantages of the VAWT concept, especially in the Darrieus configuration, may outweigh their disadvantages in specific applications, like the urban context or floating platforms. To enable these concepts further, efficient, accurate, and robust aerodynamic prediction tools and design guidelines are needed for VAWTs, for which low-order simulation methods have not yet reached a maturity comparable to that of the Blade Element Momentum Theory for HAWT applications. The two computationally efficient methods that are presently capable of capturing the unsteady aerodynamics of Darrieus turbines are the Double Multiple Streamtubes (DMS) theory, based on momentum balances, and the Lifting Line Theory (LLT) coupled to a free vortex wake model. Both methods make use of tabulated lift and drag coefficients to compute the blade forces. Since the incidence angle range experienced by a VAWT blade is much wider than that of a HAWT blade, the accuracy of the polars in describing the stall region and the transition towards the "thin-plate-like" behaviour has a large effect on simulation results. This paper will demonstrate the importance of stall and post-stall data handling in the performance estimation of Darrieus VAWTs. Using validated CFD simulations as a baseline, comparisons are provided for a blade in VAWT-like motion based on a DMS and an LLT code employing three sets of post-stall data: measurements from a wind tunnel campaign, XFoil predictions extrapolated with the Viterna-Corrigan model, and a combination of them. The influence of polar extrapolation on quasi-steady operating conditions is shown, and azimuthal variations of thrust and torque are compared for exemplary tip-speed ratios (TSRs).
In addition, the major relevance of a proper dynamic stall model in both simulation methods is highlighted and discussed.
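The Viterna-Corrigan extrapolation mentioned in this abstract extends airfoil polars past stall toward flat-plate behaviour; a sketch using the formulas as commonly stated in the literature is below. The stall-point values and aspect ratio are hypothetical, and the exact coefficient forms vary slightly between sources.

```python
import math

def viterna_extrapolate(alpha_deg, alpha_s_deg, cl_s, cd_s, aspect_ratio):
    """Viterna-Corrigan flat-plate extrapolation of lift and drag
    coefficients beyond the stall angle (common literature form)."""
    a = math.radians(alpha_deg)
    a_s = math.radians(alpha_s_deg)
    cd_max = 1.11 + 0.018 * aspect_ratio   # empirical, for AR <= 50
    a1 = cd_max / 2.0
    a2 = ((cl_s - cd_max * math.sin(a_s) * math.cos(a_s))
          * math.sin(a_s) / math.cos(a_s) ** 2)
    b2 = (cd_s - cd_max * math.sin(a_s) ** 2) / math.cos(a_s)
    cl = a1 * math.sin(2 * a) + a2 * math.cos(a) ** 2 / math.sin(a)
    cd = cd_max * math.sin(a) ** 2 + b2 * math.cos(a)
    return cl, cd

# Hypothetical stall point (15 deg, CL=1.2, CD=0.05) for an AR = 10 section
cl90, cd90 = viterna_extrapolate(90.0, 15.0, 1.2, 0.05, 10.0)
print(round(cl90, 3), round(cd90, 3))  # at 90 deg: CL -> 0, CD -> CD_max
```

The sanity check at 90 degrees (zero lift, drag equal to the flat-plate maximum) is what makes such extrapolations plausible over the very wide incidence range a Darrieus blade sees.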
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Estimation theory – Simulation methods"

1

Shanken, Jay, and Guofu Zhou. Estimating and Testing Beta Pricing Models: Alternative Methods and their Performance in Simulations. National Bureau of Economic Research, 2006. http://dx.doi.org/10.3386/w12055.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Steinmen, Jeffrey, and Craig Lammers. Real Time Estimation and Prediction using Optimistic Simulation and Control Theory Techniques. Defense Technical Information Center, 2008. http://dx.doi.org/10.21236/ada478368.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Perkins, William A., and Marshall C. Richmond. MASS2, Modular Aquatic Simulation System in Two Dimensions, Theory and Numerical Methods. Office of Scientific and Technical Information (OSTI), 2007. http://dx.doi.org/10.2172/919712.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Terzic, Vesna, and William Pasco. Novel Method for Probabilistic Evaluation of the Post-Earthquake Functionality of a Bridge. Mineta Transportation Institute, 2021. http://dx.doi.org/10.31979/mti.2021.1916.

Full text
Abstract:
While modern overpass bridges are safe against collapse, their functionality will likely be compromised in the case of a design-level or beyond-design-level earthquake, which may generate excessive residual displacements of the bridge deck. Presently, there is no validated, quantitative approach for estimating the operational level of a bridge after an earthquake, due to the difficulty of accurately simulating residual displacements. This research develops a novel method for probabilistic evaluation of the post-earthquake functionality state of a bridge; the approach is founded on an explicit evaluation of bridge residual displacements and the associated traffic capacity by considering realistic traffic load scenarios. This research proposes a high-fidelity finite-element model for bridge columns, developed and calibrated using existing experimental data from shake table tests of a full-scale bridge column. This finite-element model of the bridge column is further expanded to enable evaluation of the axial load-carrying capacity of damaged columns, which is critical for an accurate evaluation of the traffic capacity of the bridge. Existing experimental data from crushing tests on columns with earthquake-induced damage support this phase of the finite-element model development. To properly evaluate the bridge's post-earthquake functionality state, realistic traffic loadings representative of different bridge conditions (e.g., immediate access, emergency traffic only, closed) are applied in the proposed model following an earthquake simulation. The traffic loadings in the finite-element model consider the distribution of the vehicles on the bridge causing the largest forces in the bridge columns.
APA, Harvard, Vancouver, ISO, and other styles
5

Keyes, Thomas. Combining Novel Simulation Methods and Nucleation Theory to Uncover the Secrets of Gas Hydrates. Office of Scientific and Technical Information (OSTI), 2016. http://dx.doi.org/10.2172/1247317.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Clark, G. Angular and Linear Velocity Estimation for a Re-Entry Vehicle Using Six Distributed Accelerometers: Theory, Simulation and Feasibility. Office of Scientific and Technical Information (OSTI), 2003. http://dx.doi.org/10.2172/15004928.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Ginting, Victor. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and A Posteriori Error Estimation Methods. Office of Scientific and Technical Information (OSTI), 2014. http://dx.doi.org/10.2172/1123410.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Estep, Donald. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods. Office of Scientific and Technical Information (OSTI), 2015. http://dx.doi.org/10.2172/1227282.

Full text
APA, Harvard, Vancouver, ISO, and other styles
