Dissertations / Theses on the topic 'Maximum likelihood method - MMV'
Consult the top 50 dissertations / theses for your research on the topic 'Maximum likelihood method - MMV.'
Costa, Sidney Tadeu Santiago. "Teoria de resposta ao item aplicada no ENEM." Universidade Federal de Goiás, 2017. http://repositorio.bc.ufg.br/tede/handle/tede/6944.
With the score obtained in the Exame Nacional do Ensino Médio (ENEM), students can apply for places at many public institutions of higher education and for government programmes such as Universidade para Todos (ProUni) and the Fundo de Financiamento Estudantil (Fies). ENEM scores its objective questions with a methodology called Item Response Theory (Teoria de Resposta ao Item, TRI), which differs in several respects from Classical Test Theory (TCT). Under TCT the main factor determining a candidate's result is the number of correct answers, whereas under TRI, beyond the number of correct responses, it is essential to analyse which items were answered correctly. This work explains what TRI is and how the methodology is applied in large-scale assessments. It gives a historical account of the logistic models used in TRI and justifies each parameter in the model's main equation. To determine the model parameters and compute each candidate's final score, an optimization procedure called the Maximum Likelihood Method (Método da Máxima Verossimilhança, MMV) is used. The computational tools were the R software, with packages developed for applying TRI, and the Visual Basic programming language, used to program functions (macros) in spreadsheets.
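The core of TRI scoring is the three-parameter logistic (3PL) item characteristic curve, P(θ) = c + (1 − c) / (1 + e^(−a(θ−b))), with discrimination a, difficulty b and guessing parameter c. As a rough illustration of the MMV step described above — a Python sketch with made-up item parameters, not the thesis's own code (which used R and Visual Basic) — one can estimate a candidate's ability θ by maximizing the likelihood of the observed right/wrong pattern:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def p3pl(theta, a, b, c):
    """3PL probability of a correct answer given ability theta."""
    return c + (1.0 - c) / (1.0 + np.exp(-a * (theta - b)))

def neg_log_likelihood(theta, a, b, c, u):
    """Negative log-likelihood of a response pattern u (1 = correct)."""
    p = p3pl(theta, a, b, c)
    return -np.sum(u * np.log(p) + (1 - u) * np.log(1 - p))

# Hypothetical item parameters and one candidate's responses.
a = np.array([1.2, 0.8, 1.5, 1.0])   # discrimination
b = np.array([-0.5, 0.0, 0.7, 1.2])  # difficulty
c = np.array([0.2, 0.2, 0.2, 0.2])   # guessing floor
u = np.array([1, 1, 0, 0])           # observed right/wrong pattern

res = minimize_scalar(neg_log_likelihood, bounds=(-4, 4),
                      args=(a, b, c, u), method="bounded")
print(f"ML ability estimate: theta = {res.x:.3f}")
```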
Al-Nashi, Hamid Rasheed. "A maximum likelihood method to estimate EEG evoked potentials." Thesis, McGill University, 1985. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=72016.
With the model described in state-space form, a Kalman filter is constructed, and the variance of the innovation process of the response measurements is derived. A maximum likelihood solution to the EP estimation problem is then obtained via this innovation process.
Tests using simulated responses show that the method is effective in estimating the EP signal at signal-to-noise ratios as low as −6 dB. Other tests using real normal visual response data yield reasonably consistent EP estimates whose main components are narrower and larger than the ensemble average. In addition, the likelihood function obtained by our method can be used as a discriminant between normal and abnormal responses, and it requires smaller ensembles than other methods.
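The link the abstract draws between the innovation process and the likelihood is the standard prediction-error decomposition: each Kalman innovation is Gaussian with a variance supplied by the filter, so the log-likelihood of the measurements is a sum over innovations. A minimal scalar sketch of that idea (a generic state-space model, not Al-Nashi's EP model):

```python
import numpy as np

def kalman_loglik(y, F, H, Q, R, x0, P0):
    """Gaussian log-likelihood of observations y under a scalar
    state-space model x' = F x + w, y = H x + v, computed via the
    prediction-error (innovation) decomposition."""
    x, P, ll = x0, P0, 0.0
    for yt in y:
        # Predict.
        x, P = F * x, F * P * F + Q
        # Innovation and its variance.
        nu = yt - H * x
        S = H * P * H + R
        ll += -0.5 * (np.log(2 * np.pi * S) + nu**2 / S)
        # Update.
        K = P * H / S
        x, P = x + K * nu, (1 - K * H) * P
    return ll

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0, 1, 200)) + rng.normal(0, 0.5, 200)
print(kalman_loglik(y, F=1.0, H=1.0, Q=1.0, R=0.25, x0=0.0, P0=10.0))
```

Maximizing this quantity over the model parameters gives the maximum likelihood estimates.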
Montpellier, Pierre Robert. "The maximum likelihood method of estimating dynamic properties of structures." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk2/ftp04/mq21050.pdf.
Khiabanian, Hossein. "A maximum-likelihood multi-resolution weak lensing mass reconstruction method." View abstract/electronic edition; access limited to Brown University users, 2008. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:3318339.
Donmez, Ayca. "Adaptive Estimation And Hypothesis Testing Methods." PhD thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/3/12611724/index.pdf.
Maximum likelihood estimators (MLEs) are commonly used. They are consistent, unbiased and efficient, at any rate for large n. In most situations, however, MLEs are elusive because of computational difficulties. To alleviate these difficulties, Tiku's modified maximum likelihood estimators (MMLEs) are used. They are explicit functions of sample observations and easy to compute. They are asymptotically equivalent to MLEs and, for small n, are equally efficient; moreover, MLEs and MMLEs are numerically very close to one another. Calculating MLEs and MMLEs requires the functional form of the underlying distribution to be known. For machine data processing, however, such is not the case; what is reasonable to assume is that the underlying distribution belongs to a broad class of distributions. Huber assumed that the underlying distribution is long-tailed symmetric and developed the so-called M-estimators. It is very desirable for an estimator to be robust and to have a bounded influence function. M-estimators, however, implicitly censor certain sample observations, which most practitioners do not appreciate. Tiku and Surucu suggested a modification to Tiku's MMLEs. The new MMLEs are robust and have bounded influence functions; in fact, they are overall more efficient than M-estimators for long-tailed symmetric distributions. In this thesis, we propose a new modification to MMLEs. The resulting estimators are robust and have bounded influence functions. We also show that they can be used not only for long-tailed symmetric distributions but for skew distributions as well. We use the proposed modification in the context of experimental design and linear regression, and show that the resulting estimators, and the hypothesis testing procedures based on them, are indeed superior to earlier such estimators and tests.
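For context on the M-estimators discussed above, here is a minimal sketch of Huber's M-estimator of location, computed by iteratively reweighted averaging — a generic textbook formulation, not the thesis's MMLE procedure:

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location via iteratively reweighted means.
    Observations farther than k*scale from the centre are down-weighted,
    which bounds the influence of outliers."""
    x = np.asarray(x, dtype=float)
    mu = np.median(x)
    scale = np.median(np.abs(x - mu)) / 0.6745  # MAD-based scale
    for _ in range(max_iter):
        r = (x - mu) / scale
        # Huber weights: 1 inside [-k, k], k/|r| outside.
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(5, 1, 95), rng.normal(25, 1, 5)])  # 5% outliers
print(f"mean = {data.mean():.2f}, Huber M-estimate = {huber_location(data):.2f}")
```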
Li, Ka Lok. "A Strategy for Earthquake Catalog Relocations Using a Maximum Likelihood Method." Thesis, Uppsala universitet, Geofysik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-188826.
Full textKraay, Andrea L. (Andrea Lorraine) 1976. "Physically constrained maximum likelihood method for snapshot deficient adaptive array processing." Thesis, Massachusetts Institute of Technology, 2003. http://hdl.handle.net/1721.1/87331.
Full text"February 2003."
Includes bibliographical references (leaves 139-141).
by Andrea L. Kraay.
Elec.E.and S.M.in Electrical Engineering
Stamatakis, Alexandros. "Distributed and parallel algorithms and systems for inference of huge phylogenetic trees based on the maximum likelihood method." [S.l. : s.n.], 2004. http://deposit.ddb.de/cgi-bin/dokserv?idn=973053380.
Full textIshakova, Gulmira. "On the use of Quasi-Maximum Likelihood Estimation and Indirect Method for Stochastic Volatility models." Thesis, Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-1641.
Full textStochastic volatility models have been focus for research in recent years.
One interesting and important topic has been the estimation procedure.
For a given stochastic volatility model this project aims to compare two
methods of parameter estimation.
Li, Xiangfei. "Reliability Assessment for Complex Systems Using Multi-level, Multi-type Reliability Data and Maximum Likelihood Method." Ohio University / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1402483535.
Hu, Mike. "A Collapsing Method for Efficient Recovery of Optimal Edges." Thesis, University of Waterloo, 2002. http://hdl.handle.net/10012/1144.
Full textGüimil, Fernando. "Comparing the Maximum Likelihood Method and a Modified Moment Method to fit a Weibull distribution to aircraft engine failure time data." Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1997. http://handle.dtic.mil/100.2/ADA337364.
Full textGüimil, Fernando. "Comparing the Maximum Likelihood Method and a Modified Moment Method to fit a Weibull distribution to aircraft engine failure time data." Thesis, Monterey, California. Naval Postgraduate School, 1997. http://hdl.handle.net/10945/8112.
This thesis provides a comparison of the accuracies of two methods for fitting a Weibull distribution to a set of aircraft engine time-between-failure data. One method is the Maximum Likelihood Method, which assumes that the engine failure times are independent. The other is a Modified Method of Moments procedure, which uses the fact that if the time to failure T has a Weibull distribution with scale parameter λ and shape parameter β, then T^β has an exponential distribution with scale parameter λ^β. The latter method makes no assumption about independent failure times. A comparison is made using times randomly generated by a program that resembles the way engine failures occur in the real world for an engine with three subsystems; the generated operating times between failures for the same engine are not statistically independent. The comparison was then extended to real data. Although both methods gave good fits, the Maximum Likelihood Method produced a better fit than the Modified Method of Moments. Explanations for this are analysed and presented in the conclusions.
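A rough illustration of the Weibull-fitting comparison described above, using SciPy's built-in maximum likelihood fit on hypothetical data, together with a simple check of the T^β exponential transform (not the thesis's modified-moments procedure):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
beta_true, lam_true = 1.8, 1000.0            # shape, scale (hours)
t = lam_true * rng.weibull(beta_true, 500)   # simulated failure times

# Maximum likelihood fit of a two-parameter Weibull (location fixed at 0).
beta_hat, _, lam_hat = stats.weibull_min.fit(t, floc=0)
print(f"MLE: shape = {beta_hat:.3f}, scale = {lam_hat:.1f}")

# Check the transform: T**beta should be exponential with scale lam**beta,
# so its sample mean should be close to lam**beta.
z = t ** beta_hat
print(f"mean of T^beta = {z.mean():.3g}  vs  lam^beta = {lam_hat**beta_hat:.3g}")
```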
Ikeda, Mitsuru, Kazuhiro Shimamoto, Takeo Ishigaki, and Kazunobu Yamauchi. "Statistical method in a comparative study in which the standard treatment is superior to others." Nagoya University School of Medicine, 2002. http://hdl.handle.net/2237/5385.
Full textHehn, Lukas [Verfasser], and J. [Akademischer Betreuer] Blümer. "Search for dark matter with EDELWEISS-III using a multidimensional maximum likelihood method / Lukas Hehn ; Betreuer: J. Blümer." Karlsruhe : KIT-Bibliothek, 2016. http://d-nb.info/1114273473/34.
Full textSmith, Gary Douglas. "Measurements of spin asymmetries for deeply virtual compton scattering off the proton using the extended maximum likelihood method." Thesis, University of Glasgow, 2013. http://theses.gla.ac.uk/5042/.
He, Bin. "APPLICATION OF THE EMPIRICAL LIKELIHOOD METHOD IN PROPORTIONAL HAZARDS MODEL." Doctoral diss., University of Central Florida, 2006. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/4384.
Papp, Joseph C. "Physically constrained maximum likelihood (PCML) mode filtering and its application as a pre-processing method for underwater acoustic communication." Thesis, Massachusetts Institute of Technology, 2009. http://hdl.handle.net/1721.1/54649.
Mode filtering is most commonly implemented using the sampled mode shape or pseudoinverse algorithms. Buck et al. [1] placed these techniques in the context of a broader maximum a posteriori (MAP) framework. However, the MAP algorithm requires that the signal and noise statistics be known a priori. Adaptive array processing algorithms are candidates for improving performance without the need for a priori signal and noise statistics. A variant of the physically constrained, maximum likelihood (PCML) algorithm [2] is developed for mode filtering that achieves the same performance as the MAP mode filter yet does not need a priori knowledge of the signal and noise statistics. The central innovation of this adaptive mode filter is that the received signal's sample covariance matrix, as estimated by the algorithm, is constrained to be one that can be physically realized given a modal propagation model and an appropriate noise model. The first simulation presented in this thesis models the acoustic pressure field as a complex Gaussian random vector and compares the performance of the pseudoinverse, reduced-rank pseudoinverse, sampled mode shape, PCML minimum power distortionless response (MPDR), PCML-MAP, and MAP mode filters. The PCML-MAP filter performs as well as the MAP filter without the need for a priori data statistics. The PCML-MPDR filter also performs nearly as well as the MAP filter, and avoids a sawtooth pattern that occurs with the reduced-rank pseudoinverse filter. The second simulation models the underwater environment and broadband communication setup of the Shallow Water 2006 (SW06) experiment. Data processing results are presented from the SW06 experiment, showing the reduced sensitivity of the PCML-MPDR filter to white noise compared with the reduced-rank pseudoinverse filter. Lastly, a linear, decision-directed, RLS equalizer is used to combine the response of several modes, and its performance is compared with an equalizer applied directly to the data received on each hydrophone.
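As background for the MPDR variant named above, here is the generic minimum power distortionless response weight computation in its standard array-processing form — not the PCML algorithm itself, which additionally constrains the estimated covariance to be physically realizable:

```python
import numpy as np

def mpdr_weights(R, d):
    """MPDR weights w = R^{-1} d / (d^H R^{-1} d): minimize the output
    power w^H R w subject to the distortionless constraint w^H d = 1."""
    Rinv_d = np.linalg.solve(R, d)
    return Rinv_d / (d.conj() @ Rinv_d)

# Hypothetical 8-sensor example: one "signal" mode shape d, plus a strong
# interfering mode g and white noise in the covariance R.
rng = np.random.default_rng(7)
n = 8
d = np.exp(1j * np.pi * np.arange(n) * 0.3) / np.sqrt(n)   # signal mode
g = np.exp(1j * np.pi * np.arange(n) * 0.8) / np.sqrt(n)   # interferer
R = 10 * np.outer(g, g.conj()) + 0.1 * np.eye(n)           # covariance model

w = mpdr_weights(R, d)
print(f"|w^H d| = {abs(w.conj() @ d):.3f}  (distortionless -> 1)")
print(f"|w^H g| = {abs(w.conj() @ g):.3e}  (interferer suppressed)")
```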
Luo, Li [Verfasser], Florian [Akademischer Betreuer] Jarre, and Björn [Akademischer Betreuer] Scheuermann. "Beating the Clock - An Offline Clock Synchronization Method Inspired by Maximum Likelihood Techniques / Li Luo. Gutachter: Florian Jarre ; Björn Scheuermann." Düsseldorf : Universitäts- und Landesbibliothek der Heinrich-Heine-Universität Düsseldorf, 2014. http://d-nb.info/1064379818/34.
Full textOrozco, M. Catalina (Maria Catalina). "Inversion Method for Spectral Analysis of Surface Waves (SASW)." Diss., Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/5124.
Full textTarawneh, Monther. "A Novel Quartet-Based Method for Inferring Evolutionary Trees from Molecular Data." University of Sydney, 2008. http://hdl.handle.net/2123/2301.
Molecular evolution is the key to explaining the divergence of species and the origin of life on earth. The main task in the study of molecular evolution is the reconstruction of evolutionary trees from sequence data of the current species. This thesis introduces a novel algorithm for inferring evolutionary trees from genetic data using a quartet-based approach. The new method recursively merges sub-trees based on a global statistic provided by the global quartet weight matrix. The quartet weights can be computed using several methods. Since the quartet weight computation is the most expensive procedure in this approach, the new method enables the parallel inference of large evolutionary trees. Several techniques were developed to deal with quartet inaccuracies. In addition, the new method is flexible in that it can combine morphological and molecular phylogenetic analyses to yield more accurate trees. We also introduce the concept of a critical point, where more than one merge is possible for the same sub-tree. The critical-point concept can provide more detailed information about the relationships between species and show how close they are, which enables the detection of other reasonable trees. We evaluated the algorithm on both synthetic and real data sets. Experimental results showed that the new method achieved significantly better accuracy in comparison with existing methods.
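To illustrate the quartet weights the abstract refers to, here is a minimal sketch using the classical four-point condition on a distance matrix: for taxa {i, j, k, l}, the topology ij|kl is supported when d(i,j) + d(k,l) is the smallest of the three pairings. This is a generic construction, not the thesis's specific weighting scheme:

```python
import numpy as np
from itertools import combinations

def quartet_support(D, i, j, k, l):
    """Score the three topologies of quartet {i,j,k,l} from distances:
    a smaller sum of within-pair distances means stronger support."""
    s = {
        "ij|kl": D[i, j] + D[k, l],
        "ik|jl": D[i, k] + D[j, l],
        "il|jk": D[i, l] + D[j, k],
    }
    return min(s, key=s.get), s

# Hypothetical additive distances for 5 taxa (0..4).
D = np.array([
    [0, 3, 8, 9, 9],
    [3, 0, 7, 8, 8],
    [8, 7, 0, 5, 5],
    [9, 8, 5, 0, 2],
    [9, 8, 5, 2, 0],
], dtype=float)

for q in combinations(range(5), 4):
    topo, scores = quartet_support(D, *q)
    print(q, "->", topo, scores)
```

Collecting these per-quartet scores over all quartets yields the kind of global quartet weight matrix on which the recursive merging operates.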
Ginos, Brenda Faith. "Parameter Estimation for the Lognormal Distribution." Diss., CLICK HERE for online access, 2009. http://contentdm.lib.byu.edu/ETD/image/etd3205.pdf.
Owen, Claire Elayne Bangerter. "Parameter Estimation for the Beta Distribution." Diss., CLICK HERE for online access, 2008. http://contentdm.lib.byu.edu/ETD/image/etd2670.pdf.
Full textTaylor, Simon. "Dalitz Plot Analysis of η'→ηπ+π-." Thesis, Uppsala universitet, Kärnfysik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-421196.
Hattaway, James T. "Parameter Estimation and Hypothesis Testing for the Truncated Normal Distribution with Applications to Introductory Statistics Grades." Diss., CLICK HERE for online access, 2010. http://contentdm.lib.byu.edu/ETD/image/etd3412.pdf.
Full textSarlak, Nermin. "Evaluation And Modeling Of Streamflow Data: Entropy Method, Autoregressive Models With Asymmetric Innovations And Artificial Neural Networks." Phd thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/3/12606135/index.pdf.
Cloyd, James Dale. "Data mining with Newton's method." [Johnson City, Tenn. : East Tennessee State University], 2002. http://etd-submit.etsu.edu/etd/theses/available/etd-1101102-081311/unrestricted/CloydJ111302a.pdf.
Full textMtotywa, Busisiwe Percelia, and G. J. Lyman. "A systems engineering approach to metallurgical accounting of integrated smelter complexes." Thesis, Stellenbosch : Stellenbosch University, 2008. http://hdl.handle.net/10019.1/4846.
ENGLISH ABSTRACT: The growing need to improve accounting accuracy and precision, and to standardise generally accepted measurement methods in the mining and processing industries, has led a number of organisations to join together under the AMIRA International umbrella to pursue these objectives. As part of this venture, Anglo Platinum undertook a project on material balancing around its largest smelter, the Waterval Smelter. The primary objective of the project was to perform a statistical material balance around the Waterval Smelter using the Maximum Likelihood method with respect to platinum, rhodium, nickel, sulphur and chrome (III) oxide. Pt, Rh and Ni were selected for their significant contribution to the company's profit margin, while S was included because of its environmental importance. Cr2O3 was included because of the difficulties its presence poses in the smelting of PGMs. The objective was achieved by performing a series of statistical computations: quantification of total and analytical uncertainties, detection of outliers, estimation and modelling of daily and monthly measurement uncertainties, parameter estimation, and data reconciliation. Comparisons were made between the Maximum Likelihood and Least Squares methods. Total uncertainties associated with the daily grades were determined by variographic studies. The estimated Pt standard deviations were, with a few exceptions, within 10% relative to the respective average grades. The total uncertainties were split into their components by determining analytical variances from analytical replicates; the results indicated that the sampling components of the total uncertainty were generally larger than their analytical counterparts. WCM, the platinum-rich Waterval Smelter product, carries an uncertainty worth approximately R2 103 000 in its daily Pt grade. This figure shows that the quality of measurements affects not only the accuracy of metal accounting, but can also have considerable financial implications if not quantified and managed. The daily uncertainties were estimated using Kriging and bootstrapped to obtain estimates of the monthly uncertainties. Distributions were fitted by maximum likelihood with the distribution-fitting tool of the JMP 6.0 programme, and goodness-of-fit tests were performed. The data were fitted with normal and beta distributions, and there was a notable decrease in skewness from the daily to the monthly data. The reconciliation of the data was performed using Maximum Likelihood and compared with the widely used Least Squares. The Maximum Likelihood and Least Squares adjustments were performed on simulated data in order to test accuracy and to determine the extent of error reduction after the reconciliation exercise. The test showed that the two methods had comparable accuracies and error-reduction capabilities. However, it was shown that modelling uncertainties with the unbounded normal distribution does lead to the estimation of adjustments so large that negative adjusted values result. The benefit of modelling the uncertainties with a bounded distribution, here the beta distribution, is that the possibility of obtaining negative adjusted values is eliminated: ML-adjusted values (beta) will always be non-negative, and therefore feasible.
In a further comparison of the ML (bounded model) and LS methods in the material balancing of the Waterval Smelter complex, it was found that for all streams whose uncertainties were modelled with a beta distribution, i.e. those whose distribution possessed some degree of skewness, the ML adjustments were significantly smaller than their LS counterparts. It is therefore concluded that Maximum Likelihood (with bounded models) is a rigorous alternative to LS data reconciliation, with the following benefits:
-- better estimates, because the nature of the data (its distribution) is not assumed but determined through distribution fitting and parameter estimation;
-- adjusted values that can never be negative, due to the bounded nature of the distribution.
The novel contributions made in this thesis are as follows:
-- the Maximum Likelihood method was employed for the first time in the material balancing of non-normally distributed data and compared with the well-known Least Squares method;
-- geostatistical methods were integrated with data reconciliation in an original way to quantify and predict measurement uncertainties;
-- for the first time, measurement uncertainties were modelled with a distribution that is non-normal and bounded in nature, leading to smaller adjustments.
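For reference, the least-squares side of the reconciliation comparison has a standard closed form: adjust measurements m with covariance V to satisfy linear balance constraints Ax = 0 via x̂ = m − V Aᵀ(A V Aᵀ)⁻¹ A m. A minimal sketch on a hypothetical two-stream splitter (feed = concentrate + tailings), not the thesis's smelter model:

```python
import numpy as np

def ls_reconcile(m, V, A):
    """Least-squares data reconciliation: minimize (x-m)^T V^{-1} (x-m)
    subject to A x = 0. Closed form: x = m - V A^T (A V A^T)^{-1} A m."""
    AVAT = A @ V @ A.T
    return m - V @ A.T @ np.linalg.solve(AVAT, A @ m)

# Measured flows (t/h): feed, concentrate, tailings; balance: f - c - t = 0.
m = np.array([100.0, 42.0, 55.0])   # inconsistent: 42 + 55 != 100
V = np.diag([4.0, 1.0, 9.0])        # measurement variances
A = np.array([[1.0, -1.0, -1.0]])   # mass-balance constraint

x = ls_reconcile(m, V, A)
print("reconciled:", x, "residual:", A @ x)  # residual -> 0
```

The maximum likelihood reconciliation studied in the thesis generalizes this: with Gaussian uncertainties the two coincide, while the bounded beta model changes the objective and keeps the adjusted values non-negative.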
Sozen, Serkan. "A Viterbi Decoder Using System C For Area Efficient Vlsi Implementation." Master's thesis, METU, 2006. http://etd.lib.metu.edu.tr/upload/12607567/index.pdf.
Chabičovský, Martin. "Statistická analýza rozdělení extrémních hodnot pro cenzorovaná data." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2011. http://www.nusl.cz/ntk/nusl-229487.
Full textGoto, Daniela Bento Fonsechi. "Estimação de maxima verossimilhança para processo de nascimento puro espaço-temporal com dados parcialmente observados." [s.n.], 2008. http://repositorio.unicamp.br/jspui/handle/REPOSIP/306192.
Abstract: The goal of this work is to study maximum likelihood estimation for a spatial pure birth process under two different sampling schemes: (a) permanent observation over a fixed time interval [0, T]; (b) observation of the process only after a fixed time T. Under scheme (b) the birth times are unknown and only the locations are observed, giving a missing-data problem. The likelihood function for the nonhomogeneous pure birth process on a compact set can be written, through the projection method described by Garcia and Kurtz (2008), as the projection of the likelihood function. Because the projected likelihood can be interpreted as an expectation, Monte Carlo methods can be used to compute estimators. Almost-sure convergence and convergence in distribution are obtained for the approximants to the maximum likelihood estimator. Simulation studies show that the approximants are appropriate.
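Under full observation on [0, T] (scheme (a)), the likelihood of a linear pure birth (Yule) process with per-individual rate λ has a simple form, and the MLE is the number of births divided by the total exposure ∫₀ᵀ N(s) ds. A small simulation sketch of this textbook case — not the spatial, partially observed setting of the thesis:

```python
import numpy as np

def simulate_yule(lam, n0, T, rng):
    """Simulate a Yule process on [0, T]: with n individuals alive, the
    next birth arrives after an Exp(lam * n) waiting time."""
    t, n, births = 0.0, n0, []
    while True:
        t += rng.exponential(1.0 / (lam * n))
        if t > T:
            return births
        births.append(t)
        n += 1

def yule_mle(births, n0, T):
    """MLE lam_hat = (#births) / integral_0^T N(s) ds."""
    times = [0.0] + births + [T]
    sizes = n0 + np.arange(len(births) + 1)   # N(s) between events
    exposure = np.sum(sizes * np.diff(times))
    return len(births) / exposure

rng = np.random.default_rng(3)
births = simulate_yule(lam=0.5, n0=10, T=4.0, rng=rng)
print(f"{len(births)} births, lam_hat = {yule_mle(births, 10, 4.0):.3f}")
```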
Nguyen, Ngoc B. "Estimation of Technical Efficiency in Stochastic Frontier Analysis." Bowling Green State University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=bgsu1275444079.
Kondlo, Lwando Orbet. "Estimation of Pareto distribution functions from samples contaminated by measurement errors." Thesis, University of the Western Cape, 2010. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_6141_1297831463.
The intention is to draw more specific connections between certain deconvolution methods and also to demonstrate the application of the statistical theory of estimation in the presence of measurement error. A parametric methodology for deconvolution when the underlying distribution is of the Pareto form is developed. Maximum likelihood estimation (MLE) of the parameters of the convolved distributions is considered. Standard errors of the estimated parameters are calculated from the inverse Fisher's information matrix and by a jackknife method. Probability-probability (P-P) plots and Kolmogorov-Smirnov (K-S) goodness-of-fit tests are used to evaluate the fit of the posited distribution. A bootstrapping method is used to calculate the critical values of the K-S test statistic, which are not available.
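In the error-free case, the Pareto MLE that such a procedure extends has a closed form: for a known minimum x_m, the tail index estimate is α̂ = n / Σ log(x_i / x_m). A minimal sketch of this baseline — without the measurement-error convolution that is the actual subject of the thesis:

```python
import numpy as np

def pareto_alpha_mle(x, xm):
    """Closed-form MLE of the Pareto tail index for known minimum xm:
    alpha_hat = n / sum(log(x_i / xm))."""
    x = np.asarray(x)
    return len(x) / np.sum(np.log(x / xm))

rng = np.random.default_rng(5)
alpha_true, xm = 2.5, 1.0
# Inverse-CDF sampling: F(x) = 1 - (xm/x)^alpha  =>  x = xm (1-U)^(-1/alpha).
x = xm * (1.0 - rng.uniform(size=10_000)) ** (-1.0 / alpha_true)
print(f"alpha_hat = {pareto_alpha_mle(x, xm):.3f} (true {alpha_true})")
```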
Horimoto, Andréa Roselí Vançan Russo. "Estimativa do valor da taxa de penetrância em doenças autossômicas dominantes: estudo teórico de modelos e desenvolvimento de um programa computacional." Universidade de São Paulo, 2009. http://www.teses.usp.br/teses/disponiveis/41/41131/tde-15102009-161545/.
The main objective of this dissertation was the development of a computer program, in Microsoft® Visual Basic® 6.0, for estimating the penetrance rate of autosomal dominant diseases from the information contained in genealogies. Some of the algorithms used in the program were based on ideas already published in the literature by researchers and (post)graduate students of the Laboratory of Human Genetics, Department of Genetics and Evolutionary Biology, Institute of Biosciences, University of São Paulo. We developed several other methods to deal with particular structures found frequently in published genealogies, such as: (a) the absence of information on the phenotype of the individual generating the genealogy; (b) the grouping of trees of normal individuals without a separate description of the offspring number per individual; (c) the analysis of structures containing consanguineous unions; (d) the determination of general solutions in simple analytic form for the likelihood functions of trees of normal individuals with regular branching, and for the heterozygosity probabilities of any individual belonging to these trees. In addition to the executable version of the program summarised above, we also prepared, in collaboration with the dissertation supervisor and the undergraduate student Marcio T. Onodera (main author of this particular version), a web version (PenCalc Web). It enables the calculation of heterozygosity probabilities and the offspring risk for all individuals of the genealogy, two features not included in the present version of our program. PenCalc Web can be accessed freely at http://www.ib.usp.br/~otto/pencalcweb. Another important contribution of this dissertation was the development of an estimation model with a generation-dependent penetrance rate, as suggested by the inspection of families with some autosomal dominant diseases, such as the ectrodactyly-tibial hemimelia (ETH) syndrome, a condition which exhibits a phenomenon similar to anticipation in relation to the penetrance rate. The models with constant and variable penetrance rates, as well as practically all the methods developed in this dissertation, were applied to 21 individual genealogies from the literature with cases of ETH and to the set of all these genealogies (meta-analysis). The corresponding results of all these analyses are comprehensively presented.
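As a toy version of the estimation problem described above: if each child of an affected heterozygote carries the mutant allele with probability 1/2 and carriers are affected with probability K (the penetrance), then an offspring is affected with probability K/2, and a binomial maximum likelihood estimate of K follows from counts of affected offspring. This is a deliberately simplified sketch with hypothetical counts, ignoring the ascertainment and pedigree structure that the thesis's program handles:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_lik(K, affected, total):
    """Binomial negative log-likelihood: each offspring of an affected
    heterozygote is affected with probability K/2."""
    p = K / 2.0
    return -(affected * np.log(p) + (total - affected) * np.log(1 - p))

affected, total = 34, 100  # hypothetical offspring counts
res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1.0),
                      args=(affected, total), method="bounded")
print(f"K_hat = {res.x:.3f}  (closed form: {2 * affected / total:.3f})")
```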
El, Matouat Abdelaziz. "Sélection du nombre de paramètres d'un modèle comparaison avec le critère d'Akaike." Rouen, 1987. http://www.theses.fr/1987ROUES054.
Kunz, Lukas Brad. "A New Method for Melt Detection on Antarctic Ice-Shelves and Scatterometer Calibration Verification." Diss., CLICK HERE for online access, 2004. http://contentdm.lib.byu.edu/ETD/image/etd527.pdf.
Nourmohammadi, Mohammad. "Statistical inference with randomized nomination sampling." Elsevier B.V, 2014. http://hdl.handle.net/1993/30150.
Macková, Simona. "Makroekonomická analýza s využitím postupů prostorové ekonometrie." Master's thesis, Vysoká škola ekonomická v Praze, 2017. http://www.nusl.cz/ntk/nusl-359198.
Svoboda, Ondřej. "Využití Poissonova rozdělení pro předpovědi výsledků sportovních utkání." Master's thesis, Vysoká škola ekonomická v Praze, 2017. http://www.nusl.cz/ntk/nusl-359303.
Zhang, Tianyu. "Problème inverse statistique multi-échelle pour l'identification des champs aléatoires de propriétés élastiques." Thesis, Paris Est, 2019. http://www.theses.fr/2019PESC2068.
Within the framework of linear elasticity theory, the numerical modeling and simulation of the mechanical behavior of heterogeneous materials with complex random microstructure give rise to many scientific challenges at different scales. Although at macroscale such materials are usually modeled as homogeneous, deterministic elastic media, at microscale they are not only heterogeneous and random but often also cannot be properly described by the local morphological and mechanical properties of their constituents. Consequently, a mesoscale is introduced between macroscale and microscale, at which the mechanical properties of such a random linear elastic medium are represented by a prior non-Gaussian stochastic model parameterized by a small or moderate number of unknown hyperparameters. In order to identify these hyperparameters, an innovative methodology has recently been proposed that solves a multiscale statistical inverse problem using only partial and limited experimental data at both macroscale and mesoscale. It is formulated as a multi-objective optimization problem: minimizing a (vector-valued) multi-objective cost function defined by three numerical indicators, corresponding to (scalar-valued) single-objective cost functions that quantify distances between multiscale experimental data, measured simultaneously at macroscale and mesoscale on a single specimen subjected to a static test, and the numerical solutions of the deterministic and stochastic computational models used for simulating the multiscale experimental test configuration under uncertainties. This research work aims to improve the multiscale statistical inverse identification method in terms of computational efficiency, accuracy and robustness by introducing (i) an additional mesoscopic numerical indicator quantifying the distance between the spatial correlation length(s) of the measured experimental fields and those of the computed numerical fields at mesoscale, so that each hyperparameter of the prior stochastic model has its own dedicated single-objective cost function; this allows the time-consuming global optimization algorithm (a genetic algorithm) to be replaced with a more efficient algorithm, such as a fixed-point iterative algorithm, for solving the underlying multi-objective optimization problem at lower computational cost; and (ii) an ad hoc stochastic representation of the hyperparameters of the prior stochastic model of the random elasticity field at mesoscale, modeling them as random variables whose probability distributions are constructed by the maximum entropy principle under constraints defined by the available objective information, and whose own hyperparameters are determined by maximum likelihood estimation from the available data, in order to enhance both the robustness and the accuracy of the statistical inverse identification of the prior stochastic model. We also propose to solve the multi-objective optimization problem using machine learning based on artificial neural networks. Finally, the improved methodology is first validated on a fictitious virtual material within the framework of 2D plane-stress and 3D linear elasticity theory, and then illustrated on a real heterogeneous biological material (beef cortical bone) in 2D plane-stress linear elasticity.
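One of the improvements described above replaces a global genetic algorithm with a fixed-point iteration, updating each hyperparameter from its own dedicated indicator. A generic sketch of that pattern — a made-up scalar calibration x = g(x), not the thesis's elasticity model:

```python
import numpy as np

def fixed_point(g, x0, tol=1e-10, max_iter=200):
    """Iterate x_{k+1} = g(x_k) until convergence; this converges when g
    is a contraction near the fixed point (|g'(x*)| < 1)."""
    x = x0
    for k in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")

# Toy calibration: find x with x = cos(x), a classic contraction example.
x_star, iters = fixed_point(np.cos, x0=1.0)
print(f"x* = {x_star:.10f} after {iters} iterations; "
      f"cos(x*) = {np.cos(x_star):.10f}")
```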
Lucci, Lisa. "Valutazione della resistenza a fatica di provini in Maraging Steel realizzati in Additive Manufacturing." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2020. http://amslaurea.unibo.it/19826/.
Chabot, John Alva. "VALIDATING STEADY TURBULENT FLOW SIMULATIONS USING STOCHASTIC MODELS." Miami University / OhioLINK, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=miami1443188391.
Full textSonono, Masimba Energy. "Applications of conic finance on the South African financial markets /| by Masimba Energy Sonono." Thesis, North-West University, 2012. http://hdl.handle.net/10394/9206.
Ulgen, Burcin Emre. "Estimation In The Simple Linear Regression Model With One-fold Nested Error." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/3/12606171/index.pdf.
Luo, Hao. "Some Aspects on Confirmatory Factor Analysis of Ordinal Variables and Generating Non-normal Data." Doctoral thesis, Uppsala universitet, Statistiska institutionen, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-149423.
Full textRendas, Luís Manuel Pinto. "Estimação da densidade populacional em amostragem por transectos lineares com recurso ao modelo logspline." Master's thesis, Universidade de Évora, 2001. http://hdl.handle.net/10174/15256.
Reichmanová, Barbora. "Užití modelů diskrétních dat." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2018. http://www.nusl.cz/ntk/nusl-392846.
Xu, Xingbai. "Asymptotic Analysis for Nonlinear Spatial and Network Econometric Models." The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1461249529.
Full textMishchenko, Kateryna. "Numerical Algorithms for Optimization Problems in Genetical Analysis." Doctoral thesis, Västerås : Scool of education, Culture and Communication, Mälardalen University, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-650.