Dissertations / Theses on the topic 'Method references'

Consult the top 50 dissertations / theses for your research on the topic 'Method references.'

1

Lally, Evan M. "Fourier Transform Interferometry for 3D Mapping of Rough and Discontinuous Surfaces." Diss., Virginia Tech, 2010. http://hdl.handle.net/10919/27542.

Full text
Abstract:
Of the wide variety of existing optical techniques for non-contact 3D surface mapping, Fourier Transform Interferometry (FTI) is the method that most elegantly combines simplicity with high speed and high resolution. FTI generates continuous-phase surface maps from a projected optical interference pattern, which is generated with a simple double-pinhole source and collected in a single snapshot using conventional digital camera technology. For enhanced stability and reduced system size, the fringe source can be made from a fiber optic coupler. Unfortunately, many applications require mapping of surfaces that contain challenging features not ideally suited for reconstruction using FTI. Rough and discontinuous surfaces, commonly seen in applications requiring imaging of rock particles, present a unique set of obstacles that cannot be overcome using existing FTI techniques. This work is based on an original analysis of the limitations of FTI and the means by which errors are generated by the particular features encountered in the aggregate mapping application. Several innovative solutions have been developed to enable the use of FTI on rough and discontinuous surfaces. Through filter optimization and development of a novel phase unwrapping and referencing technique, the Method of Multiple References (MoMR), this work has enabled surface error correction and simultaneous imaging of multiple particles using FTI. A complete aggregate profilometry system has been constructed, including a MoMR-FTI software package and graphical user interface, to implement these concepts. The system achieves better than 22 µm z-axis resolution, and comprehensive testing has proven it capable of handling a wide variety of particle surfaces. A range of additional features has been developed, such as error correction, particle boundary mapping, and automatic data quality windowing, to enhance the usefulness of the system in its intended application.
Because of its high accuracy, high speed and ability to map varied particles, the developed system is ideally suited for large-scale aggregate characterization in highway research laboratories. Additionally, the techniques developed in this work are potentially useful in a large number of applications in which surface roughness or discontinuities pose a challenge.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
2

Amaral, Laura Guidali. "A influência de imagens indiretas como fonte de inspiração no processo criativo de design." Universidade do Vale do Rio dos Sinos, 2013. http://www.repositorio.jesuita.org.br/handle/UNISINOS/3687.

Full text
Abstract:
CAPES - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Uma das principais características do conceito de Design Estratégico utilizado nesta pesquisa é o uso de um método de projeto que propõe uma etapa denominada metaprojeto - um espaço para discussão do problema de design. Nesta etapa, utliza-se um tipo especial de pesquisa chamado Pesquisa Blue Sky. Esta é composta essencialmente de referências visuais externas ao contexto do que se está projetando, e busca revelar tendências e apresentar estímulos úteis para o projeto. Na presente investigação busca-se refletir sobre o papel de imagens não relacionadas ao problema de projeto - referências indiretas - como fontes de inspiração no processo criativo de design. Para tanto se aborda a questão do problema de projeto mal estruturado e o seu processo de resolução; o pensamento criativo; e o uso da imagem como referência. É proposto um método de pesquisa de caráter exploratório que inclui o desenvolvimento de lincográficos e visa avaliar como estas imagens específicas são utilizadas durante o processo projetual. Apresentam-se indícios que o uso de imagens indiretas auxilia na formulação de ideias mais conceituais por aumentar o tempo de reflexão dos indivíduos sobre o problema e sobre a solução.
One of the main characteristics of the Strategic Design concept adopted in this research is the use of a design method that proposes a stage called metadesign - a space for discussion of the design problem. At this stage, a special type of tool is used, called Blue Sky research. It is essentially composed of visual references that are external to the design context, and it reveals trends and offers useful stimuli for the project. The purpose of this investigation is to reflect on the role of images not directly related to the design problem - indirect references - as sources of inspiration for the creative design process. To do so, several aspects are reviewed, such as the "wicked problem" and its solution process, the creative process, and the use of images as reference. An exploratory study, which includes the development of linkographs, is proposed in order to evaluate how these specific images are used during the design process. Evidence is presented showing that indirect references help the formulation of conceptual ideas by increasing the time individuals spend reflecting on the problem and on the solution.
3

Rytych, Maxim. "Možnosti deklarativního programování v jazyku Java 8." Master's thesis, Vysoká škola ekonomická v Praze, 2015. http://www.nusl.cz/ntk/nusl-202113.

Full text
Abstract:
This paper concerns itself with the possibilities of declarative programming in the new Java 8 language, specifically using elements adopted from the domain of functional programming languages: functions as values and lazy streams of data. The goal of this paper is to demonstrate the possibilities of declarative programming using these elements, analyze their implementation and design original extensions. The contribution lies particularly in showing the possibilities of the new elements, in the implementation analysis and in the design of new functionality. The output can be used by a Czech reader who is at least slightly advanced in the field of information technology. The paper is divided into theoretical and practical parts. The theoretical part is covered by chapters 3-8. It describes the motivation for the introduction of the new elements, describes functional programming and its basic principles, then shows the basic principles of the newly introduced elements, and ends with a description of the java.util.stream package. The practical part is covered by chapters 9 and 10. It deals with stream operations and with the design of extensions to existing functionality.
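The two Java 8 elements this abstract names, functions as values (method references) and lazy streams, can be sketched in a few lines. The class name and sample words below are illustrative and not taken from the thesis:

```java
import java.util.Arrays;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class MethodRefDemo {
    // A method reference treats an existing method as a function value:
    // String::length is shorthand for the lambda s -> s.length().
    static List<Integer> lengths(List<String> words) {
        return words.stream()              // lazy: no elements processed yet
                    .map(String::length)   // intermediate op, still lazy
                    .collect(Collectors.toList()); // terminal op triggers evaluation
    }

    public static void main(String[] args) {
        Function<String, Integer> len = String::length; // function stored as a value
        System.out.println(len.apply("stream"));
        System.out.println(lengths(Arrays.asList("lambda", "stream", "ref")));
    }
}
```

Nothing in the pipeline runs until the terminal `collect` call, which is the laziness the abstract refers to.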
4

MacFadden, James. "Computational methods for incompressible fluid flows, with reference to interface modelling by an extended finite element method." Thesis, Swansea University, 2006. https://cronfa.swan.ac.uk/Record/cronfa42810.

Full text
Abstract:
In this thesis an implicit Semi-Discrete Stabilized eXtended Finite Element formulation has been successfully developed and implemented for laminar Newtonian incompressible fluid flows. In doing so we have contributed to research in the fields of incompressible fluid flow, multiphase flow and fluid-rigid body interaction. The fluid flows are governed by the incompressible viscous Navier-Stokes equations, using a Finite Element formulation to model the fluid behaviour numerically. A Semi-Discrete time integration scheme was implemented, discretizing in space and leaving the system of ordinary differential equations to be integrated in time. Initially the classical Galerkin method is used to formulate the boundary value problem from the governing equations; however, stability issues due to incompressibility and dominant advection terms force the implementation of a stabilized formulation, i.e. SUPG/PSPG. This approach gives greater flexibility in the choice of velocity/pressure interpolations, such as equal-order functions. The time integration schemes (Generalized alpha method and Generalized Midpoint rule) were compared and contrasted, with the Generalized alpha method demonstrating improved convergence. The highly nonlinear form of the governing equations required an implicit iterative solver, and the Newton-Raphson procedure was chosen. Several tests were performed throughout the formulation of the boundary value problem to validate the implementation. The result is a robust, efficient and accurate unsteady incompressible Newtonian fluid formulation. The eXtended Finite Element Method (X-FEM) was then introduced by adding terms to the FEM formulation in a Partition of Unity framework. With the addition of complex solution procedures, X-FEM was implemented and tested for multiphase flow and fluid-rigid body interaction, demonstrating the attractive qualities of this method.
5

Jeffrey, Chris C. "Applications of Single Reference Methods to Multi-Reference Problems." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc801919/.

Full text
Abstract:
Density functional theory is an efficient and useful method for solving single-reference computational chemistry problems; however, it struggles with multi-reference systems. Modifications have been developed to improve the capabilities of density functional theory. In this work, modified density functional theory has been successfully applied to multi-reference systems with large amounts of non-dynamical correlation. It has also been successfully applied to geometry optimizations for lanthanide trifluorides.
6

Burkhart, Joshua. "A Method for Reference-Free Genome Assembly Quality Assessment." Thesis, University of Oregon, 2013. http://hdl.handle.net/1794/13338.

Full text
Abstract:
How to assess the quality of a genome assembly without the help of a reference sequence is an open question. Only a few techniques are currently used in the literature and each has obvious bias. An additional method, restriction enzyme associated DNA (RAD) marker alignment, is proposed here. With high enough density, this method should be able to assess the quality of de novo assemblies without the biases of current methods. With the growing ambition to sequence new genomes and the accelerating ability to do so cost effectively, methods to assess the quality of reference-free genome assemblies will become increasingly important. In addition to the existing methods of EST and conserved sequence alignment, RAD marker alignment may contribute to this effort.
7

Hassan, Saadia Bashir. "Methods for Preclinical Evaluation of Cytotoxic Drugs : With Special Reference to the Cyanoguanidine CHS 828 and Hollow Fiber Method." Doctoral thesis, Uppsala : Acta Universitatis Upsaliensis : Univ.-bibl. [distributör], 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-4696.

Full text
8

Abrams, Micah Lowell. "General-Order Single-Reference and Multi-Reference Methods in Quantum Chemistry." Diss., Georgia Institute of Technology, 2005. http://hdl.handle.net/1853/6852.

Full text
Abstract:
Many-body perturbation theory and coupled-cluster theory, combined with carefully constructed basis sets, can be used to accurately compute the properties of small molecules. We applied a series of methods and basis sets aimed at reaching the ab initio limit to determine the barrier to planarity for ethylene cation. For potential energy surfaces corresponding to bond dissociation, a single Slater determinant is no longer an appropriate reference, and the single-reference hierarchy breaks down. We computed full configuration interaction benchmark data for calibrating new and existing quantum chemical methods for the accurate description of potential energy surfaces. We used the data to calibrate single-reference configuration interaction, perturbation theory, and coupled-cluster theory and multi-reference configuration interaction and perturbation theory, using various types of molecular orbitals, for breaking single and multiple bonds on ground-state and excited-state surfaces. We developed a determinant-based method which generalizes the formulation of many-body wave functions and energy expectation values. We used the method to calibrate single-reference and multi-reference configuration interaction and coupled-cluster theories, using different types of molecular orbitals, for the symmetric dissociation of water. We extended the determinant-based method to work with general configuration lists, enabling us to study, for the first time, arbitrarily truncated coupled-cluster wave functions. We used this new capability to study the importance of configurations in configuration interaction and coupled-cluster wave functions at different regions of a potential energy surface.
9

Gushchin, Ivan. "The modal method : a reference method for modeling of the 2D metal diffraction gratings." Thesis, Saint-Etienne, 2011. http://www.theses.fr/2011STET4010/document.

Full text
Abstract:
Les éléments de diffraction sont largement utilisés aujourd'hui dans un nombre grandissant d'applications grâce à la progression des technologies de microstructuration dans le sillage de la micro-électronique. Pour un design optimal de ces éléments, des méthodes de modélisation précises sont nécessaires. Plusieurs méthodes ont été développées et sont utilisées avec succès pour des réseaux de diffraction unidimensionnel de différents types. Cependant, les méthodes existantes pour les réseaux deux dimensionnel ne couvrent pas tous types de structures possibles. En particulier, le calcul de l'efficacité de diffraction sur les réseaux métalliques à deux dimensionnel avec parois verticales représente encore une grosse difficulté pour les méthodes existantes. Le présent travail a pour objectif le développement d'une méthode exacte de calcul de l'efficacité de diffraction de tels réseaux qui puisse servir de référence. La méthode modale développée ici - dénommée ,,true-mode" en anglais - exprime le champ électromagnétique sur la base des vrais modes électromagnétiques satisfaisant les conditions limites de la structure 2D à la différence d'une méthode modale où les modes sont ceux d'une structure approchée obtenue, par exemple, par développement de Fourier. L'identification et la représentation de ces vrais modes à deux dimensions restait à faire et ce n'est pas le moindre des résultats du présent travail que d'y avoir conduit. Les expressions pour la construction du champ sont données avec des exemples de résultats concrets. Sont aussi fournies les équations pour le calcul des intégrales de recouvrement et des éléments de la matrice de diffusion
Diffractive elements are now widely used in many applications, as microstructuring technologies are making fast progress in the wake of microelectronics. For the optimization of these elements, accurate modeling methods are needed. Well-developed and widely used methods exist for one-dimensional diffraction gratings of different types. However, the methods available for solving two-dimensional periodic structures do not cover all possible grating types. The objective of this work is the development of a method to calculate the diffraction efficiency of two-dimensional metallic gratings. The one-dimensional true-mode method is based on representing the field inside the periodic element as a superposition of particular solutions, each of them satisfying the boundary conditions exactly. The method developed here uses the same representation of the field within two-dimensional gratings. In the present work, the existing modal methods for one-dimensional gratings serve as the basis for the construction of the modal field distribution functions within two-dimensional gratings. The modal field distributions make it possible to calculate the overlap integrals of the fields outside the grating with those within the structure. The transition matrix coefficients are formed on the basis of these integrals. The final stage is the calculation of the scattering matrix from the two transition matrices. The equations for the field reconstruction are provided and accompanied by examples of results. The equations used to calculate the overlap integrals and the scattering matrix coefficients are also provided.
10

Di Girolamo, Nicola <1987>. "Method-Comparison and Reference Interval Determination in Animal Medicine." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2016. http://amsdottorato.unibo.it/7650/.

Full text
Abstract:
An acceptable agreement between two measurement methods permits interchangeability of the instruments. For this purpose, we have investigated the agreement of several instruments frequently used in clinical practice with their laboratory counterparts. We have estimated the agreement between a point-of-care blood gas analyzer (i-Stat, Abaxis) and a bench-top blood gas analyzer (Nova, Biomedical) in venous samples from Hermann’s tortoises. We have estimated the agreement between a point-of-care chemistry analyzer (VetScan VS2, Abaxis) and a laboratory analyzer (Olympus AU400, Olympus Co.) in venous samples from Hermann’s tortoises. We have estimated the agreement between portable blood glucose meters (Accu-Chek, Aviva; AlphaTrak 2, Abbott) and a laboratory analyzer (Dimension EXL, Siemens) in venous samples from client-owned rabbits. We have estimated the agreement between point-of-care bench-top glucose measurement (VetScan VS2, Abaxis) and a laboratory analyzer (Dimension EXL, Siemens) in venous samples from client-owned rabbits. Beyond method comparison and validation, reference interval determination for common laboratory tests is required to allow the clinician to discriminate individuals that differ from the remaining population for a certain parameter. We have calculated reference intervals for blood gas in Hermann’s tortoises. We have calculated reference intervals for protein electrophoresis in Hermann’s tortoises. We have described normal hematology in Hermann’s tortoises. We have calculated reference intervals for clinical chemistry in Hermann’s tortoises. We have calculated reference intervals for aldosterone in ferrets. Based on our results, each animal species requires individual validation of laboratory methods and its own reference intervals. Lack of consideration of these findings may result in clinical misdiagnosis and improper treatment of animals.
11

Lin, Hui-Fen. "A Comparison of Three Item Selection Methods in Criterion-Referenced Tests." Thesis, University of North Texas, 1988. https://digital.library.unt.edu/ark:/67531/metadc332327/.

Full text
Abstract:
This study compared three methods of selecting the best discriminating test items and the resultant test reliability of mastery/nonmastery classifications. These three methods were (a) the agreement approach, (b) the phi coefficient approach, and (c) the random selection approach. Test responses from 1,836 students on a 50-item physical science test were used, from which 90 distinct data sets were generated for analysis. These 90 data sets contained 10 replications of the combination of three different sample sizes (75, 150, and 300) and three different numbers of test items (15, 25, and 35). The results of this study indicated that the agreement approach was an appropriate method to be used for selecting criterion-referenced test items at the classroom level, while the phi coefficient approach was an appropriate method to be used at the district and/or state levels. The random selection method did not have similar characteristics in selecting test items and produced the lowest reliabilities, when compared with the agreement and the phi coefficient approaches.
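The phi coefficient approach described above correlates each item's outcome (correct/incorrect) with the examinee's mastery classification via a 2x2 table. A minimal sketch of that computation follows; the class name and cell counts are invented for illustration and are not drawn from the study:

```java
public class PhiCoefficient {
    // Phi coefficient for a 2x2 table relating item response to mastery status.
    // Cell counts: a = master & correct,    b = master & incorrect,
    //              c = nonmaster & correct, d = nonmaster & incorrect.
    // phi = (a*d - b*c) / sqrt((a+b)(c+d)(a+c)(b+d))
    static double phi(long a, long b, long c, long d) {
        double num = (double) a * d - (double) b * c;
        double den = Math.sqrt((double) (a + b) * (c + d) * (a + c) * (b + d));
        return num / den;
    }

    public static void main(String[] args) {
        // Hypothetical counts for one item answered by 100 examinees:
        // masters mostly answer correctly, nonmasters mostly do not.
        System.out.println(phi(40, 10, 20, 30));
    }
}
```

Items with a higher phi discriminate masters from nonmasters better, which is the selection criterion the phi coefficient approach applies.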
12

Oliphant, Nevin Horace. "A multireference coupled-cluster method using a single-reference formalism." Diss., The University of Arizona, 1991. http://hdl.handle.net/10150/185629.

Full text
Abstract:
The coupled-cluster (CC) equations including single, double, triple and quadruple excitations (CCSDTQ) are graphically derived using Feynman diagrams. These equations are programmed, and an iterative reduced linear equation method is used to solve them. A few points on the potential curves for the dissociation of some model systems with a single bond (LiH and Li₂) are calculated using CC doubles (CCD), singles and doubles (CCSD), singles, doubles and triples (CCSDT), and CCSDTQ. These calculations demonstrate the magnitude of the CC contributions arising from triple and quadruple excitation amplitudes to the stretching of a chemical bond. A multi-reference coupled-cluster singles and doubles (MRCCSD) method utilizing two reference determinants, which differ by a two-electron excitation, is then proposed. One of these determinants is selected as the formal reference determinant. The proposed method is based on the single-reference coupled-cluster equations truncated after quadruples, with appropriate restrictions placed on the triple and quadruple amplitudes to allow only those amplitudes which correspond to single and double excitations from the second reference determinant. The computational expense of this method is no more than twice that of singles and doubles from a single reference (CCSD). These equations are programmed, and the potential curves for the dissociation of a few model systems with single bonds (LiH, BH, and H₂O) are calculated to demonstrate the correct bond dissociation properties of this method. These calculations also demonstrate how much of the CC energy contribution arising from the triple and quadruple excitation amplitudes can be attributed to single and double excitations from the second reference determinant.
13

Shahid, Muhammad. "Methods for Objective and Subjective Video Quality Assessment and for Speech Enhancement." Doctoral thesis, Blekinge Tekniska Högskola [bth.se], Faculty of Engineering - Department of Applied Signal Processing, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-00603.

Full text
Abstract:
The overwhelming trend of the usage of multimedia services has raised the consumers' awareness about quality. Both service providers and consumers are interested in the delivered level of perceptual quality. The perceptual quality of an original video signal can get degraded due to compression and due to its transmission over a lossy network. Video quality assessment (VQA) has to be performed in order to gauge the level of video quality. Generally, it can be performed by following subjective methods, where a panel of humans judges the quality of video, or by using objective methods, where a computational model yields an estimate of the quality. Objective methods and specifically No-Reference (NR) or Reduced-Reference (RR) methods are preferable because they are practical for implementation in real-time scenarios. This doctoral thesis begins with a review of existing approaches proposed in the area of NR image and video quality assessment. In the review, recently proposed methods of visual quality assessment are classified into three categories. This is followed by the chapters related to the description of studies on the development of NR and RR methods as well as on conducting subjective experiments of VQA. In the case of NR methods, the required features are extracted from the coded bitstream of a video, and in the case of RR methods additional pixel-based information is used. Specifically, NR methods are developed with the help of suitable techniques of regression using artificial neural networks and least-squares support vector machines. Subsequently, in a later study, linear regression techniques are used to elaborate the interpretability of NR and RR models with respect to the selection of perceptually significant features. The presented studies on subjective experiments are performed using laboratory based and crowdsourcing platforms. 
In the laboratory based experiments, the focus has been on using standardized methods in order to generate datasets that can be used to validate objective methods of VQA. The subjective experiments performed through crowdsourcing relate to the investigation of non-standard methods in order to determine perceptual preference of various adaptation scenarios in the context of adaptive streaming of high-definition videos. Lastly, the use of adaptive gain equalizer in the modulation frequency domain for speech enhancement has been examined. To this end, two methods of demodulating speech signals namely spectral center of gravity carrier estimation and convex optimization have been studied.
14

Muaksang, Kittiya (Chemistry, Faculty of Science, UNSW). "Development of reference methods and reference materials for trace level antibiotic residues in food using IDMS." Awarded by: University of New South Wales, Chemistry, 2009. http://handle.unsw.edu.au/1959.4/43262.

Full text
Abstract:
Food additives, drugs and growth-enhancing compounds are prominent tools in the production of sufficient quantities of affordable food. Thus, food safety has become a main concern of many countries for protection of their population's health. Analytical chemical measurements are increasingly important to ensure consumer protection, particularly in the field of antibiotic residues. International comparability and reliability of measurement results can be achieved by establishing and demonstrating the metrological traceability of those results to the International System of Units (SI). To ensure the metrological traceability of measurement results, reference methods and certified reference materials (CRMs) are needed. Nitrofuran antibiotic drugs used for the treatment of bacterial and protozoan infections in animals were studied in this thesis. A high accuracy reference method and an appropriate CRM for the detection of nitrofurans in prawns have been developed. A reference method for the measurement of mass fractions of nitrofuran metabolites 3-amino-5-methyl-morpholino-2-oxazolidinone (AMOZ), 3-amino-2-oxazolidinone (AOZ), semicarbazide (SEM) and 1-aminohydantoin (AHD) has been developed utilising an exact matching double isotope dilution mass spectrometry (IDMS) method. A feasibility study on a fortified reference material was carried out by preparing fortified samples containing the four nitrofuran metabolites. The stability testing results showed that the analytes in the freeze-dried matrix were more stable than in the wet form. Freeze-dried certified reference materials of nitrofurans in prawns have been produced; an incurred AOZ (CRM_P1) and incurred AOZ and fortified SEM (CRM_P2). The reference method developed was applied for the characterisation of certified reference materials for homogeneity, stability study and certification. The prepared certified reference materials were found to be homogeneous and remained stable under normal transport conditions. 
The measurement uncertainty of the measurement results obtained by the reference method developed was determined by thoroughly examining all possible sources of potential bias and precision effects. An initial measurement uncertainty for certified values of AOZ in both materials has also been estimated. Metrological traceability of measurement results obtained by the developed reference method and the reference value of the prepared certified reference materials has been established through the use of the traceable primary ratio method of exact matching double IDMS, the use of certified nitrofuran metabolite standards, and gravimetric preparation of samples.
15

Vähänikkilä, H. (Hannu). "Statistical methods in dental research, with special reference to time-to-event methods." Doctoral thesis, Oulun yliopisto, 2015. http://urn.fi/urn:isbn:9789526207933.

Full text
Abstract:
Statistical methods are an essential part of published dental research. It is important to evaluate the use of these methods to improve the quality of dental research. In the first part, the aim of this interdisciplinary study is to investigate the development of the use of statistical methods in dental journals, the quality of statistical reporting, and the reporting of statistical techniques and results in dental research papers, with special reference to time-to-event methods. In the second part, the focus is specifically on time-to-event methods, and the aim is to demonstrate the strength of time-to-event methods in collecting detailed data about the development of oral health. The first part of this study is based on an evaluation of dental articles from five dental journals. The second part of the study is based on empirical data from 28 municipal health centres in order to study variations in the survival of tooth health. There were different profiles in the statistical content among the journals. The quality of statistical reporting was quite low in the journals. The use of time-to-event methods increased from 1996 to 2007 in the evaluated dental journals. However, the benefits of these methods have not been fully adopted in dental research. The current study added new information regarding the status of statistical methods in dental research. Our study also showed that complex time-to-event analysis methods can be utilized even with detailed information on each tooth in large groups of study subjects. Authors of dental articles might apply the results of this study to improve study planning as well as the statistical sections of their research articles.
Tiivistelmä Tilastolliset tutkimusmenetelmät ovat olennainen osa hammaslääketieteellistä tutkimusta. Menetelmien käyttöä on tärkeä tutkia, jotta hammaslääketieteen tutkimuksen laatua voitaisiin parantaa. Tämän poikkitieteellisen tutkimuksen ensimmäisessä osassa tavoite on tutkia erilaisten tilastomenetelmien ja tutkimusasetelmien käyttöä, raportoinnin laatua ja tapahtumaan kuluvan ajan analysointimenetelmien käyttöä hammaslääketieteellisissä artikkeleissa. Toisessa osassa osoitetaan analysointimenetelmien vahvuus isojen tutkimusjoukkojen analysoinnissa. Ensimmäisen osan tutkimusaineiston muodostavat viiden hammaslääketieteellisen aikakauslehden artikkelit. Toisen osan tutkimusaineiston muodostivat 28 terveyskeskuksessa eri puolella Suomea hammashoitoa saaneet potilaat. Lehdet erosivat toisistaan tilastomenetelmien käytön ja tulosten esittämisen osalta. Tilastollisen raportoinnin laatu oli lehdissä puutteellinen. Tapahtumaan kuluvan ajan analysointimenetelmien käyttö on lisääntynyt vuosien 1996–2007 aikana. Tapahtumaan kuluvan ajan analysointimenetelmät mittaavat seuranta-ajan tietystä aloituspisteestä määriteltyyn päätepisteeseen. Tämän väitöksen tutkimukset osoittivat, että tapahtumaan kuluvan ajan analysointimenetelmät sopivat hyvin isojen tutkimusjoukkojen analysointiin. Menetelmien hyötyä ei ole kuitenkaan vielä saatu täysin esille hammaslääketieteellisissä julkaisuissa. Tämä tutkimus antoi uutta tietoa tilastollisten tutkimusmenetelmien käytöstä hammaslääketieteellisessä tutkimuksessa. Artikkelien kirjoittajat voivat hyödyntää tämän tutkimuksen tuloksia suunnitellessaan hammaslääketieteellistä tutkimusta
APA, Harvard, Vancouver, ISO, and other styles
16

Kelly, Patricia McGilvray. "Proposed reference method for the measurement of ionized calcium in blood." Thesis, University of Newcastle Upon Tyne, 1993. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.332246.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

PANDA, Ganesh Prasad. "THE METHOD OF INDIAN LEXICOGRAPHICAL PRESENTATION : WITH SPECIAL REFERENCE TO THE AMARAKOŚA." 名古屋大学印度哲学研究室 (Department of Indian Philosophy, University of Nagoya), 1995. http://hdl.handle.net/2237/19194.

Full text
APA, Harvard, Vancouver, ISO, and other styles
18

Marstorp, Gustav. "Automated Control System for Dust Concentration Measurements Using European Standard Reference Method." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292583.

Full text
Abstract:
Most companies that have any type of combustion or other polluting process with emissions to air need to measure their emissions to ensure they are within legal boundaries. Among the different types of pollution measurements, one of the most common is dust concentration, also known as particle concentration. An important factor in dust concentration measurements is to ensure that the concentration of the measured dust is representative of the dust concentration in the emissions. This is measured as the isokinetic deviation, defined as (v_n − v_d)/v_d, where v_n is the velocity in the entry nozzle and v_d the velocity in the duct. The methods of dust concentration measurement used today depend on manual tuning and sensor readings, and the isokinetic deviation is calculated after a test. The focus of this project was therefore to investigate how dust concentration measurement using standard reference methods could be automated, so that isokinetic sampling is controlled and regulated by an automated control system in real time. Pressures, temperatures and sampled gas volume were quantized. A PID controller was designed, implemented and tested. The PID controller took as input the differential pressure between the inside of the entry nozzle and the duct, called the zero pressure. The system was tested in a laboratory environment by letting a radial fan create a flow producing a zero pressure of -60 Pa, meaning that the pressure in the duct was 60 Pa greater than the pressure inside the entry nozzle. The PID controller was then enabled and ran for five minutes. The results showed that the PID controller managed to bring the system to the reference point in less than 50 seconds for entry nozzles of diameters 6 mm, 8 mm, 10 mm and 12 mm. The resulting isokinetic deviations were -12 %, -5 %, -6 % and -4 % for entry nozzles with diameters 6 mm, 8 mm, 10 mm and 12 mm respectively.
This is higher than the accepted values according to the European standard, which allows deviations in the interval -5 % to 15 %. However, these tests ran for relatively short time periods and started with large deviations, which made it difficult to reach an isokinetic deviation within the accepted interval. A possible improvement would be to include the real-time isokinetic deviation in the PID controller; this would make it possible to change the reference value of the zero pressure in real time and guarantee isokinetic deviations within the accepted interval, even in extraordinary situations.
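The reported control loop can be sketched as follows; the first-order plant model and the PID gains here are invented stand-ins for the real rig, and only the structure (zero pressure as controller input, isokinetic deviation as the quality measure) follows the abstract:

```python
def isokinetic_deviation(v_nozzle, v_duct):
    """Isokinetic deviation (v_n - v_d) / v_d, as a fraction."""
    return (v_nozzle - v_duct) / v_duct

class PID:
    """Textbook discrete PID controller."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive the measured "zero pressure" from -60 Pa (as in the reported test)
# towards 0 Pa, which corresponds to isokinetic sampling.
pressure = -60.0                       # Pa
pid = PID(kp=2.0, ki=0.2, kd=0.1, dt=0.1)
for _ in range(600):                   # 60 simulated seconds
    u = pid.step(0.0, pressure)
    pressure += (u - 0.1 * pressure) * 0.1   # crude toy plant response
print(round(pressure, 2))              # should settle near 0 Pa
```

The proposed improvement in the abstract amounts to feeding the running isokinetic deviation back into the setpoint instead of holding it fixed at 0 Pa.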
EU regulations require facilities to monitor and limit their dust emissions according to EU standard 13284-1:2017. A dust measurement must take many parameters into account; one of the most important is that the sampling must be performed isokinetically. Isokinetic sampling means that the gas velocity in the duct (the stack) is the same as in the probe through which the sample gas is extracted. Today's dust measurement methods rely on manual adjustments, and the isokinetic deviation is calculated after a test. This led to the research question of how an automated method for determining the mass concentration of dust could be designed so that the isokinetic deviation is calculated in real time. Pressure, temperature and gas volume were quantized from analogue sensors and communicated to a microcontroller using the serial protocol I2C. A PID controller was designed, implemented and tested. The PID controller took the pressure difference between duct and probe as its input signal; its output was a voltage that controlled the inflow into the nozzle via a motor-driven valve. The system was tested in a laboratory environment by letting a fan create a flow until the measured pressure difference between probe and duct was -60 Pa. The PID controller was then activated, and the test ran for five minutes. The test was performed for nozzles with diameters of 6 mm, 8 mm, 10 mm and 12 mm. The results showed that the PID controller steered the system to the reference point in less than 50 seconds for all nozzle diameters. The isokinetic deviations (the difference in velocity between nozzle and duct) were calculated to be -12 %, -5 %, -6 % and -4 % for the 6 mm, 8 mm, 10 mm and 12 mm nozzles. In two of the cases this was higher than the accepted value according to the EU standard, which allows deviations within the interval -5 % to 15 %. This can be explained by the tests being run over relatively short periods and starting with large deviations.
The controller could, however, be improved by using the test's current isokinetic deviation to determine the system's reference point. That would make it possible to compensate for earlier deviations and thereby achieve isokinetic deviations within the permitted interval even in extreme cases.
APA, Harvard, Vancouver, ISO, and other styles
19

Götze, Jana. "Talk the walk : Empirical studies and data-driven methods for geographical natural language applications." Doctoral thesis, KTH, Tal, musik och hörsel, TMH, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-186358.

Full text
Abstract:
Finding the way in known and unknown city environments is a task that all pedestrians carry out regularly. Current technology allows the use of smart devices as aids that can give automatic verbal route directions on the basis of the pedestrian's current position. Many such systems only give route directions and are unable to interact with the user to answer clarifications or understand other verbal input. Furthermore, they rely mainly on conveying the quantitative information that can be derived directly from geographic map representations: 'In 300 meters, turn into High Street'. However, humans reason about space predominantly in a qualitative manner, and it is less cognitively demanding for them to understand route directions that express such qualitative information, such as 'At the church, turn left' or 'You will see a café'. This thesis addresses three challenges that an interactive wayfinding system faces in the context of natural language generation and understanding: in a given situation, it must decide whether it is appropriate to give an instruction based on a relative direction, it must be able to select salient landmarks, and it must be able to resolve the user's references to objects. To address these challenges, this thesis takes a data-driven approach: data was collected in a large-scale city environment to derive decision-making models from pedestrians' behavior. As the representation of the geographical environment, all studies use the crowd-sourced OpenStreetMap database. The thesis presents methodologies for how the geographical and language data can be utilized to derive models that can be incorporated into an automatic route direction system.
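The first of the three challenges, deciding when a relative-direction instruction is appropriate, presupposes mapping turn angles onto qualitative categories. A minimal sketch of such a mapping follows; the category boundaries are invented for illustration, not taken from the thesis:

```python
def turn_category(bearing_in, bearing_out):
    """Map the angle between incoming and outgoing street bearings
    (degrees, clockwise from north) onto a qualitative direction."""
    angle = (bearing_out - bearing_in + 180) % 360 - 180  # fold into -180..180
    if abs(angle) < 30:
        return "straight"
    if abs(angle) > 150:
        return "turn around"
    side = "right" if angle > 0 else "left"
    return f"turn {side}" if abs(angle) > 60 else f"bear {side}"

print(turn_category(0, 90))   # heading north, then east -> "turn right"
```

A real system would combine such a category with a landmark ('At the church, turn left') rather than a distance, in line with the qualitative directions the thesis argues for.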


APA, Harvard, Vancouver, ISO, and other styles
20

Chan, Sun Fat. "Advancement in robot programming with specific reference to graphical methods." Thesis, Loughborough University, 1989. https://dspace.lboro.ac.uk/2134/7281.

Full text
Abstract:
This research study is concerned with the derivation of advanced robot programming methods. The methods include the use of proprietary simulation modelling and design software tools for the off-line programming of industrial robots. The study has involved the generation of integration software to facilitate the co-operative operation of these software tools. The three major research themes of "ease of usage", calibration and the integration of product design data have been followed to advance robot programming. "Ease of usage" is concerned with enhancements in the man-machine interface for robot simulation systems in terms of computer-assisted solid modelling and computer-assisted task generation. Robot simulation models represent an idealised situation, and any off-line robot programs generated from them may contain discrepancies which could seriously affect the programs' performance; calibration techniques have therefore been investigated as a method of overcoming discrepancies between the simulation model and the real world. At the present time, most computer aided design systems operate as isolated islands of computer technology, whereas their product databases should be used to support decision-making processes and ultimately facilitate the generation of machine programs. Thus the integration of product design data has been studied as an important step towards truly computer integrated manufacturing. The functionality of the three areas of study has been generalised and forms the basis for recommended enhancements to future robot programming systems.
APA, Harvard, Vancouver, ISO, and other styles
21

Haff, G. Gregory, and Michael H. Stone. "Methods of Developing Power With Special Reference to Football Players." Digital Commons @ East Tennessee State University, 2015. https://dc.etsu.edu/etsu-works/4631.

Full text
Abstract:
Power-generating capacity should be a primary training outcome for football athletes. The ability to be explosive and use high levels of strength seems to differentiate between athletes and teams. Developing training interventions that can improve both strength- and power-generating capacity would therefore be considered a paramount endeavor when attempting to optimize the physiological and performance adaptations necessary for competitive success. Too often, strength and conditioning coaches forget that the foundation of power-generating capacity is in fact high levels of muscular strength. When the development of strength is minimized or excluded from the training plan, the ability to express high power outputs is compromised. In addition, a failure to use sequenced and integrated training programs decreases the possibility of successfully increasing strength- and power-generating capacity, thus decreasing the potential for competitive success. Therefore, this brief review attempts to explain how strength- and power-generating capacity can be enhanced to increase the potential for developing the physiological and performance foundation necessary for competitive success with the football athlete.
APA, Harvard, Vancouver, ISO, and other styles
22

Suhardi, Idwan. "Development of method of coastal geomorphological analysis with reference to selected Indonesian coasts." Thesis, University of Portsmouth, 2000. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.343335.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Dam, Charlotte Elgaard. "Particle image velocimetry, accuracy of the method with particular reference to turbulent flows /." Thesis, University of Edinburgh, 1995. http://webex.lib.ed.ac.uk/homes/dam95.html.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

MARQUES, BIANCA DE SOUZA ROSSINI. "DEVELOPMENT AND CERTIFICATION BY THE PRIMARY METHOD OF REFERENCE MATERIAL OF CONDUCTIVITY PRIMARY." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=16710@1.

Full text
Abstract:
Electrolytic conductivity measures the transport of ions in a solution. Traceability is the prerequisite for the comparability and uniformity of measurements. In the case of electrolytic conductivity measurements on solutions, traceability is obtained through a primary conductivity system, which gives rise to primary certified reference materials (CRMs). CRMs are used for the control and quality assurance of analytical results; they are also essential for instrument calibration, ensuring the traceability and reliability of results. The main motivation for this work is the lack of a primary low-electrolytic-conductivity CRM, given its relevance in controlling the purity of water, a raw material for the production of medicines and vaccines, as well as the quality of fuel ethanol. The objective of this dissertation was to develop and certify a primary electrolytic conductivity reference material with a nominal value of 5 µS cm⁻¹ at 25 °C, produced from KCl salt in 30 % (m/m) 1-propanol. The homogeneity, characterization and stability studies were carried out in accordance with the ISO 30 series of standards. The certified value of the electrolytic conductivity solution, with its expanded uncertainty, was (5.00 ± 0.16) µS cm⁻¹ at 25 °C, with k = 2, for a confidence level of approximately 95 %. The certification of this primary reference material will contribute to the quality of electrolytic conductivity measurements performed in laboratories in Brazil, guaranteeing the metrological traceability of measurement results, especially in monitoring the purity of water and the quality of fuel ethanol.
A Certified Reference Material (CRM) is a reference material, accompanied by documentation issued by an authoritative body, providing one or more specified property values with associated uncertainties and traceabilities, obtained using valid procedures. CRMs are used for the control and quality assurance of analytical results and are essential for calibrating instruments, ensuring the traceability and reliability of results. Electrolytic conductivity is the ability of a solution to conduct electrical current. Traceability is a prerequisite for the comparability and uniformity of measurements. In the case of electrolytic conductivity measurements on solutions, traceability is obtained through a primary conductivity system, which leads to the primary electrolytic conductivity CRM. The CRM developed here is of low conductivity because of its importance in controlling the purity of water, a raw material for the production of medicines and vaccines, along with the analysis of fuel ethanol. The homogeneity, characterization and stability studies were carried out for a CRM with a nominal value of 5 µS cm⁻¹ according to the ISO 30 series. The certified value of the electrolytic conductivity solution, with its corresponding expanded uncertainty (k = 2, for a confidence level of approximately 95 %), was (5.00 ± 0.16) µS cm⁻¹ at 25 °C. The development and certification of this primary reference material will contribute to the quality of electrolytic conductivity measurements performed in laboratories across Brazil and South America, guaranteeing the traceability and reliability of measurement results, especially in monitoring the purity of water and analysing fuel ethanol.
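In ISO Guide 35-style certification, the expanded uncertainty of a CRM is commonly obtained by combining the characterization, homogeneity and stability components in quadrature and multiplying by the coverage factor. A sketch of that arithmetic, with component values invented only to illustrate how a result like (5.00 ± 0.16) µS/cm can arise:

```python
import math

def crm_expanded_uncertainty(u_char, u_hom, u_stab, k=2):
    """Combine CRM uncertainty components in quadrature and expand with k."""
    u_combined = math.sqrt(u_char**2 + u_hom**2 + u_stab**2)
    return k * u_combined

# Hypothetical component values (µS/cm), chosen only to show the arithmetic
U = crm_expanded_uncertainty(u_char=0.06, u_hom=0.04, u_stab=0.03)
print(f"U = {U:.2f} µS/cm (k=2, ~95 % confidence level)")
```

With k = 2 the expanded uncertainty corresponds to a confidence level of approximately 95 %, as stated in the abstract.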
APA, Harvard, Vancouver, ISO, and other styles
25

Obšilová, Lucie. "Měření malých stejnosměrných napětí." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221185.

Full text
Abstract:
This thesis deals with low-level DC voltage measurement by three methods: the potentiometric method, the reference step method and the direct method, each of which is described in the first part. The thesis also describes the Josephson voltage standard, which was used to calibrate a nanovoltmeter and a Zener reference. The theoretical part additionally deals with the evaluation of key comparison data. The main goal of the thesis is the comparison of the methods used to measure low-level DC voltage. The practical part covers the implementation of measurements with all three methods in cooperation with the Czech Metrology Institute; the measured values are processed, including uncertainty evaluations. The final part focuses on the comparison of the measurement methods: the key comparison reference value and the degree of equivalence of each method's measurement are determined, and the methods are also compared graphically. The thesis ends with an evaluation of the achieved results.
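A common way to form the key comparison reference value mentioned above is an inverse-variance weighted mean, with each method's degree of equivalence expressed as its deviation from that value; a minimal sketch with invented voltage results (the uncertainty of each deviation assumes the result contributed to the reference value, hence the subtraction in quadrature):

```python
import math

def kcrv(values, uncerts):
    """Inverse-variance weighted mean and its standard uncertainty."""
    w = [1 / u**2 for u in uncerts]
    x_ref = sum(wi * xi for wi, xi in zip(w, values)) / sum(w)
    u_ref = math.sqrt(1 / sum(w))
    return x_ref, u_ref

def degrees_of_equivalence(values, uncerts):
    x_ref, u_ref = kcrv(values, uncerts)
    # d_i = x_i - x_ref; u(d_i) = sqrt(u_i^2 - u_ref^2) for a correlated KCRV
    return [(x - x_ref, math.sqrt(u**2 - u_ref**2))
            for x, u in zip(values, uncerts)]

# Hypothetical results of three methods measuring the same voltage (µV)
values  = [10.02, 9.98, 10.05]
uncerts = [0.03, 0.02, 0.05]
for d, ud in degrees_of_equivalence(values, uncerts):
    print(f"d = {d:+.3f} ± {2 * ud:.3f} µV (k=2)")
```

A method is consistent with the reference value when |d| is smaller than its expanded uncertainty, which is the graphical comparison the abstract refers to.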
APA, Harvard, Vancouver, ISO, and other styles
26

Englund, Sofia. "Verification of a method for sexual hormone-binding globulin analysis and estimation of free testosterone." Thesis, Uppsala universitet, Institutionen för medicinsk biokemi och mikrobiologi, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-176590.

Full text
Abstract:
Introduction: Sexual hormone-binding globulin (SHBG) is a protein that binds to androgens and oestrogens, especially testosterone. The fraction of testosterone that is not bound to SHBG is the biologically active fraction, which makes its determination more relevant than determining the total amount of circulating testosterone. Because the plasma concentration of free testosterone is difficult to measure, calculations based on the concentrations of testosterone and SHBG are used to estimate the amount of free testosterone; a few calculations also include the concentration of albumin, because testosterone binds to albumin as well. The main aims of this study were to verify a method for the determination of SHBG and to calculate a reference interval for the free androgen index (FAI, testosterone/SHBG) in women. Other calculations for determining the free testosterone fraction were also compared. Methods: Testosterone, SHBG and albumin were measured in serum from 20 men and 100 women. Testosterone and SHBG were measured using immunoassays on a Roche Modular E instrument (ECLIA); albumin was measured with a c8000 Architect instrument. Four calculations were compared: two using only testosterone and SHBG, and two using testosterone, SHBG and albumin. Results/Conclusion: The verification of the SHBG method was successful, which means that the method can be taken into routine use. A reference interval for FAI was constructed. It was difficult to show whether another estimate of free testosterone would work better than FAI in clinical practice; this is discussed.
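The free androgen index used above is conventionally computed as 100 × total testosterone / SHBG, with both concentrations in nmol/L; a one-function sketch with invented sample values (the units and factor 100 follow common laboratory practice, not a formula stated in the abstract):

```python
def free_androgen_index(testosterone_nmol_l, shbg_nmol_l):
    """FAI = 100 * total testosterone / SHBG (both in nmol/L)."""
    return 100 * testosterone_nmol_l / shbg_nmol_l

# Hypothetical serum values for one female sample
print(round(free_androgen_index(1.4, 56.0), 1))  # → 2.5
```

The alternative calculations the study compares add albumin binding to this ratio; FAI itself deliberately ignores the albumin-bound fraction.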
APA, Harvard, Vancouver, ISO, and other styles
27

Williams, S. M. "Similarity methods with reference to a high order nonlinear diffusion equation." Thesis, University of Oxford, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.238142.

Full text
APA, Harvard, Vancouver, ISO, and other styles
28

Wade, Edward O. Z. "New reconstructive methods in scientific illustration with reference to systematic herpetology." Thesis, Middlesex University, 2008. http://eprints.mdx.ac.uk/6276/.

Full text
Abstract:
The present work and papers published earlier by the author, together with a detailed introductory chapter, describe the work of scientific illustration at a specialized level and how the development of drawing techniques can contribute to an understanding of the morphology and systematics of snakes. This work has its roots in the background of the writer as a scientific illustrator. The early phase reflects the disciplines and influence of science, leading to involvement first with fishes, then reptiles. A later phase arose from contact with the scientific staff of institutions such as the Natural History Museum, London, leading to an appreciation of, and participation in, taxonomy. Illustration, the visual recording of data, augmented by field experience, comprised the principal component of the research. Practical considerations directed the study towards snakes from Algeria. This work has resulted in seven published papers, most in collaboration with established scientists, mainly on the herpetology of North Africa. A synopsis of each paper is provided. In some cases the writer's collaborators are from disciplines such as molecular sequencing and computer analysis. Traditional taxonomic characters are reviewed, and new features are suggested to provide alternative approaches and applications. The species under investigation are viewed in the light of current practice in taxonomy, and newly published evidence has been considered. Some related aspects are touched on: genetics, for example, is not normally a matter for illustration but is of direct concern, as it is a parallel discipline in the investigation of the interrelationships of taxa and thus cannot be disregarded. The illustrative techniques demonstrated here are inseparable from the recording of morphological data. Such recording requires prior perception of what is to be recorded, and it is that interpretation which contributes.
In the processing of material, the experience of graphic recording of observations resulted in the acquisition of a degree of understanding which was very useful in resolving taxonomic problems, and added an extra dimension. The contribution that graphic art has made to some problems in taxonomy is discussed. Specimens in a variety of conditions of preservation require a variety of approaches and techniques of illustration. It has been found that artistic input changed from being a purely descriptive record to a means by which, in conjunction with the more standard techniques, novel conclusions could be derived, thus demonstrating an original contribution to taxonomic problems.
APA, Harvard, Vancouver, ISO, and other styles
29

Miettunen, J. (Jouko). "Statistical methods in psychiatric research, with special reference on factor analysis." Doctoral thesis, University of Oulu, 2004. http://urn.fi/urn:isbn:9514273672.

Full text
Abstract:
Abstract This interdisciplinary study describes, in its first part, the frequency with which various statistical research designs and methods are reported in psychiatric journals, and investigates how the use of these methods affects the visibility of an article in the form of received citations. The second part focuses specifically on factor analysis, and the study presents two applications of this method. Original research articles (N = 448) from four general psychiatric journals from 1996 were reviewed: the American Journal of Psychiatry, the Archives of General Psychiatry, the British Journal of Psychiatry and the Nordic Journal of Psychiatry. There were differences in the utilisation of statistical procedures among the journals. The use of statistical methods was not strongly associated with the further utilisation of an article; however, an extended description of statistical procedures had a positive effect on received citations. Factor analysis is a statistical method based on correlations among the variables, often used when the validity and structure of psychiatric instruments are studied. Exploratory factor analysis is designed to explore underlying latent factors, while in confirmatory factor analysis the aim is to verify a factor structure based on earlier findings in other data sets. Using data from the 31-year follow-up of the Northern Finland 1966 Birth Cohort Study, this study aimed to demonstrate the validity and factor structure of scales measuring temperament (Tridimensional Personality Questionnaire, TPQ, and Temperament and Character Inventory, TCI) and alexithymia (20-item Toronto Alexithymia Scale, TAS-20). The results of the exploratory factor analysis indicated good performance of the TCI and TPQ, though some developmental work is still needed; of the two scales, the TCI worked psychometrically better than the TPQ.
A confirmatory factor analysis showed that the three-factor model of the TAS-20 was in agreement with the Finnish version of the scale. To conclude, future authors of psychiatric journals might apply these results in designing their research to present intelligible and compact analyses combined with a high-quality presentation technique. The results of the factor analyses showed that the TPQ, TCI and TAS-20 can also be used in their Finnish versions.
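The exploratory factor extraction discussed in this abstract can be sketched as a power-iteration extraction of the first principal factor from a correlation matrix; the item correlations below are invented, not taken from the cohort data, and only a single factor is extracted:

```python
import math

def first_factor_loadings(R, iters=200):
    """First principal factor of a correlation matrix R via power iteration;
    loadings are the dominant eigenvector scaled by sqrt(eigenvalue)."""
    n = len(R)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient gives the dominant eigenvalue
    eig = sum(v[i] * sum(R[i][j] * v[j] for j in range(n)) for i in range(n))
    return [math.sqrt(eig) * x for x in v]

# Invented correlations for three questionnaire items that plausibly
# share one latent factor
R = [[1.0, 0.6, 0.5],
     [0.6, 1.0, 0.4],
     [0.5, 0.4, 1.0]]
print([round(l, 2) for l in first_factor_loadings(R)])  # → [0.87, 0.82, 0.76]
```

Real exploratory factor analysis additionally iterates on communalities and rotates multiple factors, and confirmatory factor analysis instead tests a prespecified loading pattern; this sketch shows only the extraction idea.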
Tiivistelmä: This interdisciplinary study describes the prevalence and significance of various statistical methods in psychiatry. The first part of the study investigates the share of different statistical methods and study designs in psychiatric articles, and in addition the effect of the methods used on the number of citations the articles receive. The second part concentrates on factor analysis and presents two related applications. The material consists of original research articles (N = 448) from four general psychiatric journals from 1996: the American Journal of Psychiatry, the Archives of General Psychiatry, the British Journal of Psychiatry and the Nordic Journal of Psychiatry. The journals differed from one another in their use of statistical methods and presentation of results. The use of statistical methods did not have a large effect on the number of citations the articles received, but an extensive description of the methods had a positive effect on the citation count. Factor analysis is a statistical research method used to study what components various complex phenomena consist of; it has proved useful especially in studying the validity and structure of psychiatric instruments. In exploratory factor analysis the aim is to search for underlying latent variables, and in confirmatory factor analysis the aim is to confirm a factor structure of the instrument established in earlier studies. This study makes use of data from the 31-year follow-up of the Northern Finland 1966 Birth Cohort. With these data, the validity and factor structure of the Finnish translations of instruments measuring temperament (Tridimensional Personality Questionnaire, TPQ, and Temperament and Character Inventory, TCI) and alexithymia (20-item Toronto Alexithymia Scale, TAS-20) are examined.
The results of the exploratory factor analysis showed that the TPQ and TCI also work well in Finnish, although there is still room for development in the instruments; the psychometric properties of the TCI were better than those of the TPQ. A confirmatory factor analysis of the alexithymia scale TAS-20 showed that the previously published three-factor model also worked well with the Finnish version. Authors of psychiatric articles can make use of the results of this study when planning psychiatric research, emphasizing a clear and compact way of analysing results and a high-quality presentation technique. Factor analysis is well suited to studying the validity of an instrument, and the study demonstrated the validity of the Finnish versions of the TPQ, TCI and TAS-20.
APA, Harvard, Vancouver, ISO, and other styles
30

Oyedepo, Gbenga A. "The Multi-reference Correlation Consistent Composite Approach: A New Vista In Quantitative Prediction Of Thermochemical And Spectroscopic Properties." Thesis, University of North Texas, 2011. https://digital.library.unt.edu/ark:/67531/metadc103368/.

Full text
Abstract:
The multi-reference correlation consistent composite approach (MR-ccCA) was designed to reproduce the accuracy of more computationally intensive ab initio quantum mechanical methods such as MR-ACPF-DK/aug-cc-pCV∞Z-DK, albeit at a significantly reduced cost. In this dissertation, the development and applications of the MR-ccCA method and a variant of its single-reference equivalent (the relativistic pseudopotential ccCA method) are reported. MR-ccCA is shown to predict the energetic properties of reactive intermediates, excited-state species and transition states to within chemical accuracy (i.e. ±1.0 kcal mol⁻¹) of reliable experimental values. The accuracy and versatility of MR-ccCA are also demonstrated in the prediction of the thermochemical and spectroscopic properties (such as atomization energies, enthalpies of formation and adiabatic transition energies of spin-forbidden excited states) of a series of silicon-containing compounds. Also reported are the thermodynamic and kinetic feasibilities of the oxidative addition of an archetypal arylglycerol β-aryl ether (β-O-4 linkage) substructure of lignin to Ni, Cu, Pd and Pt transition metal atoms, using the efficient relativistic pseudopotential correlation consistent composite approach within an ONIOM framework (rp-ccCA-ONIOM), a multi-level, multi-layer QM/QM method formulated to enhance the quantitative prediction of the chemical properties of heavy-element-containing systems larger than hitherto attainable.
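Composite approaches of the ccCA family assemble a final energy as a cheap-to-converge reference energy plus additive corrections evaluated with smaller basis sets. The following sketch shows only that bookkeeping; the term names are patterned on the ccCA literature and all numerical values are invented:

```python
def composite_energy(e_ref_cbs, d_cc, d_cv, d_sr, d_zpe):
    """ccCA-style additive composite energy (hartree): a CBS-extrapolated
    reference energy plus corrections for higher-order correlation (CC),
    core-valence correlation (CV), scalar relativity (SR) and zero-point
    vibrational energy (ZPE)."""
    return e_ref_cbs + d_cc + d_cv + d_sr + d_zpe

# Invented numbers, only to show the additive structure
print(round(composite_energy(-76.3402, -0.0031, -0.0525, -0.0009, 0.0214), 4))
```

The multi-reference variant swaps the single-reference pieces for MR analogues (e.g. a CASPT2-based reference), but the additive structure is the same.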
APA, Harvard, Vancouver, ISO, and other styles
31

Kalaver, Satchidanand Anil. "Management of reference frames in simulation and its application to error reduction in numerical integration." Thesis, Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/12406.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Ford, Clellan Stearns. "An analysis of material culture (with special reference to Samoa) : a study in method /." Ann Arbor (Mich.) : UMI dissertations services, 2003. http://catalogue.bnf.fr/ark:/12148/cb39268883v.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

McConnell, David James. "Analysis of model referenced adaptive control applied to robotic devices." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/17099.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Ergen, Mehmet Kayra. "Application Of The Map Correlation Method To The Western Blacksea Basin." Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614359/index.pdf.

Full text
Abstract:
Turkey is a developing country, and its energy demand is increasing due to its growing population and industry. To meet this growing demand, Turkey is currently developing its unused hydropower potential, especially through small hydroelectric power plants (SHPPs). Estimation of the annual electricity generation of a small hydropower plant strongly depends on streamflow data. In Turkey there are a limited number of streamgaging stations, so the estimation of streamflow at a potential SHPP location requires transferring a streamflow time series from a reference streamgaging station to the ungaged basin. To determine daily streamflow time series for ungaged catchments, typically the nearest streamgaging station is chosen as the reference. However, the distance between a reference streamgaging station and an ungaged catchment may not always be the most appropriate selection criterion. Archfield and Vogel (2010) proposed a new method, the Map Correlation Method (MCM), to select a reference streamgaging station to donate its observations to an ungaged catchment. MCM aims to identify the streamgaging station most correlated with the ungaged catchment. This new method is used in the Western Blacksea Basin in Turkey to select the best among possible reference streamgaging stations. The method proved to be promising: the most correlated streamgaging station for approximately one third of the study streamgaging stations was identified correctly by the MCM.
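The core of the selection step, ranking candidate reference gauges by how well their daily flows correlate with a site of interest, can be sketched as follows. The station codes and flow values are invented, and MCM proper additionally krigs the correlations over a map so that the ranking can be made at truly ungaged sites where no target record exists:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def best_reference(candidate_flows, target_flows):
    """Pick the candidate gauge whose record correlates best with the target."""
    return max(candidate_flows,
               key=lambda g: pearson(candidate_flows[g], target_flows))

# Invented daily flows (m³/s) at three candidate gauges and a target site
gauges = {
    "D07A001": [5.1, 6.0, 7.2, 6.5, 5.9],
    "D13A014": [2.0, 2.1, 2.0, 2.2, 2.1],
    "E13A005": [9.9, 13.5, 10.8, 12.4, 11.0],
}
target = [4.8, 5.7, 7.0, 6.2, 5.5]
print(best_reference(gauges, target))  # → D07A001
```

Transferring the chosen gauge's record to the ungaged basin (e.g. by drainage-area scaling) is then a separate step from the selection itself.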
APA, Harvard, Vancouver, ISO, and other styles
35

Vadlamani, Ananth Kalyan. "Performance Improvement Methods for Terrain Database Integrity Monitors and Terrain Referenced Navigation." Ohio University / OhioLINK, 2004. http://www.ohiolink.edu/etd/view.cgi?ohiou1089742537.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Wåhlin, Pernilla. "Theoretical Actinide Chemistry – Methods and Models." Doctoral thesis, Stockholms universitet, Fysikum, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-54848.

Full text
Abstract:
The chemistry of actinides in aqueous solution is important, and it is essential to build adequate conceptual models and develop methods applicable to actinide systems. The complex electronic structure makes benchmarking necessary. In the thesis, a prototype reaction, the water exchange reaction for uranyl(VI) in both the ground and luminescent states, described with a six-water model, was used to study the applicability of density functional methods to actinides and of different solvation models. An excellent agreement between the wave function methods CCSD(T) and MP2 was obtained in the ground state, implying that near-minimal CASPT2 can be used with confidence for the reaction in the luminescent state of uranyl(VI), while density functionals are not suited to describe the energetics of this type of reaction. There was an ambiguity concerning the position of the waters in the second hydration sphere. This issue was resolved by investigating a larger model, and, properly used, the six-water model was found to adequately describe the water exchange reaction. The effect of solvation was investigated by comparing the results from conductor-like polarizable continuum models using two cavity models. Scattered numbers made it difficult to determine which solvation model to use. The final conclusion was that the water exchange reaction in the luminescent state of uranyl(VI) should be addressed with near-minimal CASPT2 and a solvation model without explicit cavities for hydrogens. Finally, it was shown that no new chemistry appears in the luminescent state for this reaction. The thesis includes a methodological investigation of a multi-reference density functional method based on a range separation of the two-electron interaction. The method depends on a universal parameter, which has been determined for lighter elements. It is shown here that the same parameter can be used for actinides, a prerequisite for further development of the method.
The results are in that sense promising.
APA, Harvard, Vancouver, ISO, and other styles
37

Wallace, Carlington W. "Comparing Two Methods for Developing Local Sediment TMDLs to Address Benthic Impairments." Thesis, Virginia Tech, 2012. http://hdl.handle.net/10919/32345.

Full text
Abstract:
Excessive sedimentation is a leading cause of aquatic life use impairments in Virginia. As required by the Clean Water Act, a total maximum daily load (TMDL) must be developed for impaired waters. When developing a TMDL for an aquatic life use impairment where sediment has been identified as the primary pollutant, the target sediment load is often determined using a non-impaired reference watershed, i.e., the reference watershed approach (RWA). The RWA has historically been used in Virginia to establish TMDL target sediment loads because there is no numeric ambient water quality criterion for sediment. The difference between the sediment load generated by the reference watershed and the load generated by the impaired watershed is used to determine the sediment load reduction required to meet the TMDL target load in the impaired watershed. Recent quantification of the Chesapeake Bay TMDL based on Phase 5.3 of the Chesapeake Bay Watershed Model (CBWM) offers a simpler and potentially more consistent method of calculating target sediment loads for impaired watersheds within the Chesapeake Bay watershed. Researchers in the Biological Systems Engineering department at Virginia Tech have developed the "disaggregate method" (DM), which uses landuse inputs to, and pollutant load outputs from, the CBWM to determine pollutant load reductions needed in watersheds whose areas are smaller than the smallest modeling segments generally used in the CBWM. The DM uses landuse-specific unit area loads from two CBWM model runs (an existing condition run and a TMDL target load run) and a finer-scale, locally assessed landuse inventory to determine sediment loads. The DM is simpler and potentially more consistent than the reference watershed approach. This study compared the reference watershed approach and the disaggregate method in terms of required sediment load reduction.
Three sediment-impaired watersheds (Long Meadow Run, Taylor Creek and Turley Creek) within the Chesapeake Bay watershed were used for the study. Study results showed that the TMDL development method used to determine sediment loads has noticeable effects on the resulting sediment-load reduction requirements. For Taylor Creek, the RWA required 20.4 times greater reductions in sediment load (tons/yr) than the DM. The RWA also required 9.2 and 10.4 times greater reductions for the Turley Creek and Long Meadow Run watersheds, respectively. On a percentage basis, the RWA reduction for Taylor Creek was 7.3 times greater than that called for by the DM. The RWA called for 4.4 and 4.6 times greater percent reductions for the Turley Creek and Long Meadow Run watersheds, respectively. An ancillary objective of this research was to compare the sediment load reductions required for the impaired watersheds and their respective RWA reference watersheds, using the DM. This comparison revealed that both the Taylor Creek and Turley Creek watersheds required less sediment load reduction than their respective reference watersheds, while the load reductions required for Long Meadow Run were slightly greater than for its reference watershed. There are several issues associated with either the RWA or the DM for developing sediment TMDLs. Those issues are discussed in detail. Recommendations for further studies, based on questions raised by the research presented here, are also discussed.
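Whichever method sets the target, the underlying arithmetic is the same: the required reduction is the difference between the watershed's existing sediment load and the TMDL target load (derived either from an area-adjusted reference watershed load or from CBWM unit-area loads). A sketch with hypothetical numbers, not data from the study:

```python
def required_reduction(existing_load, target_load):
    """Return (reduction in tons/yr, percent reduction) needed to bring
    an impaired watershed's existing sediment load down to the TMDL target."""
    reduction = existing_load - target_load
    percent = 100.0 * reduction / existing_load
    return reduction, percent

# Hypothetical loads (tons/yr): two methods yielding different targets
rwa_cut, rwa_pct = required_reduction(1200.0, 300.0)  # reference watershed target
dm_cut, dm_pct = required_reduction(1200.0, 900.0)    # disaggregate method target
```

Because each method produces its own target load for the same existing load, the required reductions can differ by the large factors reported above.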
Master of Science
APA, Harvard, Vancouver, ISO, and other styles
38

Pedge, Nicholas Ian. "Chemometric methods for the analysis of process spectra using minimal reference data." Thesis, University of Hull, 2008. http://hydra.hull.ac.uk/resources/hull:1747.

Full text
Abstract:
To construct a spectroscopic multivariate calibration model, a set of representative mixture spectra (independent variables) and the corresponding reference values for the property of interest (dependent variables) must be obtained. For a dynamic system such as a batch or semi-batch chemical reaction, creating such a data set may be very difficult or extremely time consuming. It may not be possible to create synthetic mixtures because reaction between the various reactants may occur. If the reaction proceeds via a reactive intermediate or affords a reactive product, isolated reference standards of those species may not be available. Reactions in industry are often heterogeneous and highly concentrated; sampling the batch throughout the course of the reaction for off-line analysis can be problematic and therefore introduce significant error into measured reference values. An alternative approach that combined Self-Modelling Curve Resolution (SMCR) methods and Partial Least Squares (PLS) to construct a quantitative model using only minimal reference data was implemented. The objective was to construct a quantitative calibration model to allow real-time in-situ UV/ATR measurements to be used to determine the end-point of a chlorination reaction. Difficult reaction sampling conditions and the absence of isolated reference standards for the product and reactive intermediate required the method to be developed using only a few key reference measurements.
APA, Harvard, Vancouver, ISO, and other styles
39

Bovill, Richard Alan. "Development of novel methods for microbial identification with special reference to staphylococci." Thesis, University of Newcastle Upon Tyne, 1991. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.293052.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Au, Tat-kwong Francis. "Spline finite strip method in the study of plates and shells with special reference to bridges /." [Hong Kong : University of Hong Kong], 1994. http://sunzi.lib.hku.hk/hkuto/record.jsp?B13787159.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Islam, Mustafa R. "A hypertext graph theory reference system." Virtual Press, 1993. http://liblink.bsu.edu/uhtbin/catkey/879844.

Full text
Abstract:
The G-Net system is being developed by the members of the G-Net research group under the supervision of Dr. K. Jay Bagga. The principal objective of the G-Net system is to provide an integrated tool for dealing with various aspects of graph theory. The G-Net system is divided into two parts: GETS (Graph theory Experiments Tool Set), which will provide a set of tools for experimenting with graph theory, and HYGRES (HYpertext Graph theory Reference Service), the second subcomponent, which will aid graph theory study and research. In this research a hypertext application is built to present graph theory concepts, graph models, and algorithms. In other words, HYGRES (Guide Version) provides the hypertext facilities for organizing a graph theory database in a very natural and interactive way. A hypertext application development tool, called Guide, is used to implement this version of HYGRES. This project integrates the existing version of GETS so that it can also provide important services to HYGRES. The motivation behind this project is to study the initial criteria for developing a hypertext system, which can be used for future development of a stand-alone version of the G-Net system.
Department of Computer Science
APA, Harvard, Vancouver, ISO, and other styles
42

Van, Harmelen U. "The administration and organisation of independent study topics with special reference to secondary school geography." Thesis, Rhodes University, 1992. http://hdl.handle.net/10962/d1003300.

Full text
Abstract:
Traditional school subjects are having to compete for a place in a curriculum which is increasingly judged according to its perceived utilitarian value. According to current educational theory, geography's role in the curriculum is to develop concepts, skills, values and attitudes that allow pupils to understand the human and environmental issues which face their communities and communities throughout the world. In order to achieve these aims, teachers need to adopt a learner-centred teaching approach, yet geography teachers are faced with the dilemma of having to develop participatory teaching strategies within an existing structure which is largely product oriented. This thesis attempts to illustrate how changes can be effected in the approach to the teaching of geography, while working within existing syllabus constraints and while continuing to meet the demands made by the current examination system. To this end, Independent Study Topics (IST) are analysed as a means to bring about the desired changes in geographical education. The concept of Independent Study Topics as a 'blanket term' (Diepeveen, 1986) for pupil-centred activities is relatively recent in terms of the South African geography syllabus. In order to obtain greater clarity about the concept and its implications for geography teaching, this study examines current geographical theory relating to learner-centred approaches and relates it to teachers' perceptions of the role of IST in the geography curriculum. The second aspect of the study is concerned with the implementation of Independent Study Topics in a classroom research setting. The organisation and administration of Independent Study Topics in a single school setting is analysed and evaluated as a process of change. This analysis provides guidelines for developing a learner-centred approach, which is necessary to ensure that geography retains its position in the school curriculum of the 1990s and beyond.
APA, Harvard, Vancouver, ISO, and other styles
43

Raquet, John F. "Development of a method for kinematic GPS carrier-phase ambiguity resolution using multiple reference receivers." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1998. http://www.collectionscanada.ca/obj/s4/f2/dsk2/tape17/PQDD_0027/NQ31068.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Manga, Martin. "Srovnávací analýza SIMO a MIMO metod experimentální modální analýzy." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2012. http://www.nusl.cz/ntk/nusl-230247.

Full text
Abstract:
Today, vibration analysis represents an inseparable part of product design, especially for aeronautical components, machine tools, etc. One of the vibration analysis methods is so-called modal analysis, which determines the modal parameters of the investigated structure. This paper deals with a comparison of two commonly used approaches, namely „Single Input Multiple Output" (SIMO) and „Multiple Input Multiple Output" (MIMO) analysis. A MIMO measurement procedure is developed and discussed. To allow an objective comparison, both analyses are executed under the same conditions on a milling machine based on parallel kinematics. The results show that the choice of the so-called reference points is very important. When both references are appropriately selected, the MIMO analysis gives better results than the SIMO one.
APA, Harvard, Vancouver, ISO, and other styles
45

Pacas, Carlos R. "The evaluation of PM2.5 measurements by Federal Reference Method (FRM) and Continuous instruments in Cincinnati, Ohio." University of Cincinnati / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1321641794.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Lui, Kwok-man Richard, and 呂國民. "Construction and testing of causal models in voting behaviour with reference to Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1996. http://hub.hku.hk/bib/B31235153.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Vuylsteke, Xavier. "Development of a reference method based on the fast multipole boundary element method for sound propagation problems in urban environments : formalism, improvements & applications." Thesis, Paris Est, 2014. http://www.theses.fr/2014PEST1174/document.

Full text
Abstract:
Described as one of the ten best algorithms of the 20th century, the fast multipole formalism applied to the boundary element method allows one to handle large problems that were inconceivable only a few years ago. The motivation of the present work is thus to assess the ability of this formalism, as well as the benefits in terms of computational resources it provides, when applied to the boundary element method for solving sound propagation problems and providing reference solutions in three-dimensional dense urban environments, with the aim of assessing or improving fast engineering tools. We first introduce the mathematical background required for the derivation of the boundary integral equation for solving sound propagation problems in unbounded domains. We discuss the conventional and hyper-singular boundary integral equations used to overcome the numerical artifact of fictitious eigen-frequencies when solving exterior problems. We then give a brief historical and technical overview of the fast multipole principle, introduce the mathematical tools required to expand the elementary solution of the Helmholtz equation, and describe the main steps, from a numerical viewpoint, of fast multipole calculations. A sound propagation problem in a city block made of 5 buildings allows us to highlight instabilities in the recursive computation of translation matrices, resulting in discontinuities of the surface pressure and non-convergence of the iterative solver. This observation leads us to consider the very recent work of Gumerov and Duraiswami, related to a stable recursive computation of rotation matrix coefficients in the RCR decomposition. This improved algorithm was subsequently assessed successfully on a multi-scattering problem up to a dimensionless domain size of 207 wavelengths. 
We finally performed comparisons between a BEM algorithm, Micado3D, the FMBEM algorithm, and a ray tracing algorithm, Icare, for the calculation of averaged pressure levels in open and closed courtyards. The fast multipole algorithm allowed us to validate the results computed with Icare in the open courtyard up to 300 Hz (i.e., 100 wavelengths), while in the closed courtyard, a very sensitive area without direct or reflected contributions, further investigations related to preconditioning seem required to ensure reliable solutions from iterative solver based algorithms.
APA, Harvard, Vancouver, ISO, and other styles
48

Yurk, Brian P. "Modeling the Evolution of Insect Phenology with Particular Reference to Mountain Pine Beetle." DigitalCommons@USU, 2009. https://digitalcommons.usu.edu/etd/385.

Full text
Abstract:
Climate change is likely to disrupt the timing of developmental events (phenology) in insect populations in which development time is largely determined by temperature. Shifting phenology puts insects at risk of being exposed to seasonal weather extremes during sensitive life stages and losing synchrony with biotic resources. Additionally, warming may result in loss of developmental synchronization within a population, making it difficult to find mates or mount mass attacks against well-defended resources at low population densities. It is unknown whether genetic evolution of development time can occur rapidly enough to moderate these effects. The work presented here is largely motivated by the need to understand how mountain pine beetle (MPB) populations will respond to climate change. MPB is an important forest pest from both an economic and ecological perspective, because MPB outbreaks often result in massive timber loss. Recent MPB range expansion and increased outbreak frequency have been linked to warming temperatures. We present a novel approach to modeling the evolution of phenology by allowing the parameters of a phenology model to evolve in response to selection on emergence time and density. We also develop a temperature-dependent phenology model for MPB that accounts for multiple types of developmental variation: variation that persists throughout a life stage, random variation, and variation due to the MPB oviposition mechanism. This model is parameterized using MPB development time data from constant temperature laboratory experiments. We use Laplace's method to approximate steady distributions of the evolution model under stable temperatures. Here the mean phenotype allows for parents and offspring to be oviposited at exactly the same time of year in consecutive generations. These results are verified numerically for both MPB and a two-stage model insect. 
The evolution model is also applied to investigate the evolution of phenology for MPB and the two-stage model insect under warming temperatures. The model predicts that local populations can only adapt to climate change if development time can adapt so that individuals can complete exactly one generation per year and if the rate of temperature change is moderate.
APA, Harvard, Vancouver, ISO, and other styles
49

Wang, Jiaqi. "The Impact of Computational Methods on Transition Metal-containing Species." Thesis, University of North Texas, 2015. https://digital.library.unt.edu/ark:/67531/metadc822795/.

Full text
Abstract:
Quantum chemistry methodologies can be used to address a wide variety of chemical problems. Key to the success of quantum chemistry methodologies, however, is the selection of suitable methodologies for specific problems of interest, which often requires significant assessment. To gauge a number of methodologies, the utility of density functionals (BLYP, B97D, TPSS, M06L, PBE0, B3LYP, M06, and TPSSh) in predicting reaction energetics was examined for model studies of C-O bond activation of methoxyethane and methanol. These species provide excellent representative examples of lignin degradation via C-O bond cleavage. PBE0, which performed better than other considered DFT functionals, was used to investigate late 3d (Fe, Co, and Ni), 4d (Ru, Rh, and Pd), and 5d (Re, Os, and Ir) transition metal atom mediated Cβ -O bond activation of the β–O–4 linkage of lignin. Additionally, the impact of the choice of DFT functionals, basis sets, implicit solvation models, and layered quantum chemical methods (i.e., ONIOM, Our Own N-layered Integrated molecular Orbital and molecular Mechanics) was investigated for the prediction of pKa for a set of Ni-group metal hydrides (M = Ni, Pd, and Pt) in acetonitrile. These investigations have provided insight about the utility of a number of theoretical methods in the computation of thermodynamic properties of transition metal hydrides in solution. As single reference wavefunction methods commonly perform poorly in describing molecular systems that involve bond-breaking and forming or electronic near-degeneracies and are typically best described with computationally costly multireference wavefunction-based methods, it is imperative to a priori analyze the multireference character for molecular systems so that the proper methodology choice is applied. In this work, diagnostic criteria for assessing the multireference character of 4d transition metal-containing molecules was investigated. 
Four diagnostics were considered in this work: the weight of the leading configuration of the CASSCF wavefunction, C0²; T1, the Frobenius norm of the coupled cluster amplitude vector related to single excitations, and D1, the matrix norm of the coupled cluster amplitude vector, both arising from coupled cluster calculations; and the percent total atomization energy, %TAE. This work demonstrated the need for different diagnostic criteria for 4d molecules than for main group molecules.
APA, Harvard, Vancouver, ISO, and other styles
50

Huh, Ji Young. "Applications of Monte Carlo Methods in Statistical Inference Using Regression Analysis." Scholarship @ Claremont, 2015. http://scholarship.claremont.edu/cmc_theses/1160.

Full text
Abstract:
This paper studies the use of Monte Carlo simulation techniques in the field of econometrics, specifically statistical inference. First, I examine several estimators by deriving properties explicitly and generate their distributions through simulations. Here, simulations are used to illustrate and support the analytical results. Then, I look at test statistics where derivations are costly because of the sensitivity of their critical values to the data generating processes. Simulations here establish significance and necessity for drawing statistical inference. Overall, the paper examines when and how simulations are needed in studying econometric theories.
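As a generic illustration of the first use described above (generating an estimator's sampling distribution by simulation, not code from the thesis), the sketch below simulates the distribution of the OLS slope estimator for a simple linear model; all parameter choices are hypothetical:

```python
import numpy as np

def simulate_ols_slopes(n_obs=50, n_reps=2000, beta=2.0, sigma=1.0, seed=0):
    """Simulate the sampling distribution of the OLS slope estimator
    for the model y = beta * x + e, with e ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    slopes = np.empty(n_reps)
    for i in range(n_reps):
        x = rng.normal(0.0, 1.0, n_obs)
        e = rng.normal(0.0, sigma, n_obs)
        y = beta * x + e
        # OLS slope without intercept: (x'y) / (x'x)
        slopes[i] = x @ y / (x @ x)
    return slopes

slopes = simulate_ols_slopes()
```

A histogram of `slopes` illustrates the analytical result: the simulated distribution centers on the true slope, with spread shrinking as the sample size grows.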
APA, Harvard, Vancouver, ISO, and other styles
We offer discounts on all premium plans for authors whose works are included in thematic literature selections. Contact us to get a unique promo code!

To the bibliography