To see the other types of publications on this topic, follow the link: Modern methods of construction.

Dissertations / Theses on the topic 'Modern methods of construction'


Consult the top 50 dissertations / theses for your research on the topic 'Modern methods of construction.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online, whenever these are available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Sanna, Fausto. "Timber modern methods of construction : a comparative study." Thesis, Edinburgh Napier University, 2018. http://researchrepository.napier.ac.uk/Output/1256099.

Full text
Abstract:
The doctoral research revolves around a comparative study of timber modern methods of construction for low-rise, residential buildings in Scotland. The building techniques studied involve both timber-frame panel construction (open-panel and closed-panel systems and structural insulated panels) and massive-timber construction (cross-laminated and nail-laminated timber panels). A non-timber technique is also included in the study: more traditional, load-bearing masonry (blockwork). These different building techniques have been analysed from two complementary aspects: environmental impacts and thermal performance. The environmental study is based on the life-cycle assessment methodology and embraces various aspects: environmental impacts (e.g., climate change, acidification, eutrophication and ozone depletion), consumption of energy (renewable and non-renewable resources) and production of waste (from non-hazardous to radioactive). The assessment takes a cradle-to-gate approach and, in its structure and method, is informed by the current recommendations of the international standards in the field (i.e., the ISO 14040 series). Various environmental trade-offs between construction methods have been identified. In terms of global-warming potential (excluding biogenic carbon sequestration), results suggest that timber-frame buildings perform better than masonry buildings; this is particularly true for the open-panel system, which emits about 10% less carbon than its masonry counterpart. Massive-timber buildings tend to cause more carbon emissions than masonry ones. In terms of consumption of non-renewable primary energy, timber buildings do not generally show significant advantages with respect to blockwork-based masonry. In particular, structural-insulated-panel systems tend to show very high energy requirements. Timber-based buildings tend to cause more acidification, eutrophication and low-level ozone creation than their masonry counterpart.
The level of offsite fabrication employed for the erection of the buildings plays an important role in the magnitude of most environmental impacts, which show an average decrease of between 5% and 10% when some of the operations are shifted from the construction site to the factory. The thermal study investigates the performance of the building envelope, and in particular of external walls, by means of tests whereby the thermal behaviour of a sample of walls (of full-size section) has been observed and measured over time. On the outside, the walls were exposed to real, natural weather variations throughout the summer. The study especially focuses on the time-dependent response of three different walling systems (which results from their individual cross-sectional arrangements of building components and the associated combination of heat-storage capacity and thermal resistance): a timber-framed wall, a cross-laminated-timber wall and a masonry wall. Thus, the main goal of the study was to characterise the thermal-inertia parameters of these walls. This type of thermal behaviour is related to the repercussions of global climate change at UK level, especially in terms of increases in solar irradiance and temperature, which require an adaptation of the building envelope such that it can perform well both in wintertime and in summertime, providing maximum indoor comfort with minimum economic and environmental costs from the construction and operation of buildings. The timber-framed wall possesses the greatest capacity to slow down the propagation of temperature waves from the outer surface to the inner surface (time lag), whereas the masonry wall performs best with respect to reducing the amplitude of temperature oscillation on the inner surface (decrement factor). The cross-laminated-timber wall exhibits intermediate values of both time lag and decrement factor, relative to the other two walls.
Both the thermal and life-cycle assessment of the construction alternatives aim at assisting the design and decision-making process in the residential field and at suggesting areas that need to be addressed and improved, towards a coherent evolution of the building techniques included in this study and a step forward in the realisation of sustainable, low-rise dwellings.
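The two thermal-inertia parameters named in this abstract are straightforward to extract from paired surface-temperature series. The sketch below does this for synthetic data: the sinusoids, amplitudes and 6-hour shift are invented for illustration, not measurements from the thesis.

```python
import math

# Hypothetical 24 h of hourly surface temperatures (deg C). The indoor
# wave is a damped, delayed copy of the outdoor one, standing in for
# measured data; every number here is invented for the illustration.
hours = range(24)
outdoor = [20 + 8 * math.sin(2 * math.pi * h / 24) for h in hours]
indoor = [22 + 2 * math.sin(2 * math.pi * (h - 6) / 24) for h in hours]

def amplitude(series):
    """Half the peak-to-peak swing of a periodic series."""
    return (max(series) - min(series)) / 2

# Decrement factor: ratio of inner to outer temperature amplitude.
decrement_factor = amplitude(indoor) / amplitude(outdoor)

# Time lag: hours between the outdoor peak and the indoor peak.
lag_hours = indoor.index(max(indoor)) - outdoor.index(max(outdoor))

print(decrement_factor, lag_hours)  # 0.25 6
```

With measured rather than synthetic data, the same extraction applies after smoothing, since raw temperature peaks are noisy.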
APA, Harvard, Vancouver, ISO, and other styles
2

Hashemi, Arman. "Construction technology transfer : an assessment of the relevance of modern methods of construction to housing shortages in Iran." Thesis, Cardiff University, 2009. http://orca.cf.ac.uk/55463/.

Full text
Abstract:
The inability of the Iranian construction industry to satisfy the country's massive housing demand has transformed housing demand and supply into one of the major challenges facing the government. 1.15 million residential units need to be built each year for the next ten years. The Iranian construction industry is suffering from various deficiencies such as low productivity, small and unprofessional developers, huge waste, skilled labour shortages, defective management, an unstable economy, severe fluctuations in demand and supply, etc. Considering the potential advantages of Modern Methods of Construction (MMC), the general belief is that the application of MMC will resolve many of the above issues. Meanwhile, Iran needs to learn from the experience of other countries such as the UK to avoid repeating their mistakes. MMC is a complex subject in which various issues including standardisation, coordination, management, design, costs, sustainability, risks, etc., should be considered. Some of the above have become more important than others for Iranian stakeholders, but prioritisation and partial consideration of these issues will not be effective. This study intends to investigate the viability and applicability of the UK's advanced construction systems in Iran. For this reason, several criteria including the building regulations and standards, practicality, economy, costs, culture, sustainability, and design have been addressed, and both countries compared with regard to these issues. The results show that, although MMC can theoretically enhance the current situation of the construction industry, issues such as education and research, industry, economy, etc., need to be addressed for MMC to be applied successfully in Iran.
APA, Harvard, Vancouver, ISO, and other styles
3

Wells, Lawrence E. "Construction Applications, Practices, and Techniques of Natural Trumpets: A Comparative Analysis of Baroque and Modern Era Natural Trumpet Construction Methods." Thesis, connect to online resource. Access restricted to the University of North Texas campus, 2006. http://www.unt.edu/etd/all/Dec2006/Restricted/wells_lawrence_e/index.htm.

Full text
Abstract:
Thesis (D.M.A.)--University of North Texas, 2006. System requirements: Adobe Acrobat Reader. Accompanied by 4 recitals, recorded May 31, 2004, June 6, 2005, Feb. 20, 2006, and June 12, 2006. Includes bibliographical references (p. 65-67).
APA, Harvard, Vancouver, ISO, and other styles
4

Lui, Kwok-man Richard, and 呂國民. "Construction and testing of causal models in voting behaviour with reference to Hong Kong." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1996. http://hub.hku.hk/bib/B31235153.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Lannerhed, Petter. "Structural Diagnosis Implementation of Dymola Models using Matlab Fault Diagnosis Toolbox." Thesis, Linköpings universitet, Fordonssystem, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-138753.

Full text
Abstract:
Models are of great interest in many fields of engineering as they enable prediction of a system's behaviour, given an initial mode of the system. However, in the field of model-based diagnosis the models are used in a reverse manner: they are combined with observations of the system's behaviour in order to estimate the system mode. This thesis describes the computation of diagnostic systems based on models implemented in Dymola, a program that uses the Modelica language. The Dymola models are translated to Matlab, where an application called the Fault Diagnosis Toolbox (FDT) is applied. The FDT has functionality for pinpointing minimal structurally overdetermined (MSO) sets of equations, which is developed further in this thesis. It is shown that the implemented algorithm has exponential time complexity with regard to the degree to which the system is overdetermined, also known as the degree of redundancy. The MSOs are used to generate residuals: functions that are equal to zero given that the system is fault-free. Residual generation in Dymola is added to the original methods of the FDT, and the results of the Dymola methods are compared to the original FDT methods when given identical data. Based on these tests it is concluded that adding the Dymola methods to the FDT results in higher accuracy, as well as a new way to compute optimal observer gain. The FDT methods are applied to two models. One model is based on a system of the JAS 39 Gripen: SECS, which stands for Secondary Environmental Control System. Applications are also made to a simpler model, a two-tank system. It is validated that the computational properties of the developed methods in Dymola and Matlab differ, and that there are therefore benefits in adding the Dymola implementations to the current FDT methods.
Furthermore, the investigation of the potential isolability based on the current sensor setup in SECS shows that full isolability is achievable by adding two mass-flow sensors, and that the isolability is not limited by causality constraints. One of the found MSOs is solvable in Dymola when given data from a fault-free simulation. However, if the simulation is not fault-free, the same MSO results in a singular equation system. By utilizing MSOs that react to none of the modelled faults, certain non-monitored faults are isolated from the monitored ones, and the risk of false alarms is therefore reduced. Some residuals are generated as observers, and a new method for constructing observers is found during the thesis by using Lannerhed's theorem in combination with Pontryagin's Minimum Principle. This method enables evaluation of observer-based residuals in Dymola without selecting a specific operating point, as well as evaluation of observers based on high-index differential-algebraic equations (DAEs). The method also results in completely different behaviour of the estimation error compared to the method already implemented in the FDT. For example, one of the new observer implementations achieves both an estimation error that converges faster towards zero when no faults are present in the monitored system, and a sharper reaction to implemented faults.
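As a toy illustration of the residual idea described in this abstract (a function that stays near zero while the system is fault-free), the sketch below builds a model-based residual for a single draining tank. The model, gain and fault size are invented for the example and are far simpler than the thesis's Dymola/Matlab setup.

```python
# Toy model-based residual (invented example): a tank drains as
# h' = -k*h, so successive level readings must satisfy
#   r = h[t+1] - (1 - k*dt) * h[t]  ~  0
# whenever sensor and model agree, i.e. in the fault-free case.
K, DT = 0.5, 0.1

def simulate(steps, fault_at=None, bias=0.4):
    """Euler simulation of the tank level; optionally bias the sensor."""
    h, readings = 1.0, []
    for t in range(steps):
        faulty = fault_at is not None and t >= fault_at
        readings.append(h + (bias if faulty else 0.0))
        h += -K * h * DT
    return readings

def residuals(readings):
    return [readings[i + 1] - (1 - K * DT) * readings[i]
            for i in range(len(readings) - 1)]

ok = residuals(simulate(20))                # stays near zero
bad = residuals(simulate(20, fault_at=10))  # jumps when the bias appears
print(max(abs(r) for r in ok), max(abs(r) for r in bad))
```

Monitoring then reduces to thresholding the residual, which is the role the MSO-derived residual generators play at much larger scale.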
APA, Harvard, Vancouver, ISO, and other styles
6

Ahmed, Hasim Abdalla Obaid. "Construction and analysis of efficient numerical methods to solve Mathematical models of TB and HIV co-infection." Thesis, University of the Western Cape, 2011. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_3704_1325661141.

Full text
Abstract:
In this thesis, we study these models and design and analyze robust numerical methods to solve them. To proceed in this direction, first we study the sub-models and then the full model. The first sub-model describes the transmission dynamics of HIV that accounts for behavior change. The impact of HIV educational campaigns is also studied. Further, we explore the effects of behavior change and different responses of individuals to educational campaigns in a situation where individuals may not react immediately to these campaigns. This is done by considering a distributed time delay in the HIV sub-model. This leads to Hopf bifurcations around the endemic equilibria of the model. These bifurcations correspond to the existence of periodic solutions that oscillate around the equilibria at given thresholds. Further, we show how the delay can result in more HIV infections causing more increase in the HIV prevalence. Part of this study is then extended to study a co-infection model of HIV-TB. A thorough bifurcation analysis is carried out for this model. Robust numerical methods are then designed and analyzed for these models. Comparative numerical results are also provided for each model.
APA, Harvard, Vancouver, ISO, and other styles
7

Mavromatis, Theodoros. "Impact of different methods of climate change scenario construction on the yield distributions of winter wheat using crop growth simulation models." Thesis, University of East Anglia, 1996. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.361486.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Mueller, Ralph. "Specification and Automatic Generation of Simulation Models with Applications in Semiconductor Manufacturing." Diss., Georgia Institute of Technology, 2007. http://hdl.handle.net/1853/16147.

Full text
Abstract:
The creation of large-scale simulation models is a difficult and time-consuming task. Yet simulation is one of the techniques most frequently used by practitioners in Operations Research and Industrial Engineering, as it is less limited by modeling assumptions than many analytical methods. The effective generation of simulation models is an important challenge. Due to the rapid increase in computing power, it is possible to simulate significantly larger systems than in the past. However, the verification and validation of these large-scale simulations is typically a very challenging task. This thesis introduces a simulation framework that can generate a large variety of manufacturing simulation models. These models have to be described with a simulation data specification. This specification is then used to generate a simulation model which is described as a Petri net. This approach reduces the effort of model verification. The proposed Petri net data structure has extensions for time and token priorities. Since it builds on existing theory for classical Petri nets, it is possible to make certain assertions about the behavior of the generated simulation model. The elements of the proposed framework and the simulation execution mechanism are described in detail. Measures of complexity for simulation models that are built with the framework are also developed. The applicability of the framework to real-world systems is demonstrated by means of a semiconductor manufacturing system simulation model.
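A minimal sketch of classical Petri-net execution, the formalism the framework builds on, is given below. The net (one machine turning raw parts into finished ones) is invented for illustration and omits the time and priority extensions the thesis adds.

```python
def enabled(marking, inputs):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking[p] >= n for p, n in inputs.items())

def fire(marking, inputs, outputs):
    """Consume input tokens, produce output tokens; returns a new marking."""
    m = dict(marking)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] = m.get(p, 0) + n
    return m

# Invented net: a single machine processes raw parts one at a time.
inputs = {"raw": 1, "free": 1}
outputs = {"done": 1, "free": 1}
marking = {"raw": 3, "free": 1, "done": 0}

while enabled(marking, inputs):
    marking = fire(marking, inputs, outputs)

print(marking)  # {'raw': 0, 'free': 1, 'done': 3}
```

Because firing is a purely local token rule, properties of the generated model can be checked against classical Petri-net theory, which is the verification advantage the abstract points to.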
APA, Harvard, Vancouver, ISO, and other styles
9

Yang, Jin Rong. "The Application of Fuzzy Logic and Virtual Reality in the Study of Ancient Methods and Materials Used for the Construction of the Great Wall of China in Jinshanling." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu152410262072719.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hanahata, H. "Fundamental analysis of production methods for polyurethaneureas : Kinetics studies of the formation of polyurethaneureas in solution and the construction and evaluation of deterministic and stochastic computer models for real-time computer-control." Thesis, University of Bradford, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.384273.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Mazzoni, Christophe. "Construction d'un modele numerique de terrain : methodes et parallelisation." Paris 11, 1995. http://www.theses.fr/1995PA112220.

Full text
Abstract:
The aim of this work is to reduce the processing time needed to produce a digital terrain model (DTM) by using a parallel computer. It is a collaboration between the Institut Géographique National (IGN) and the Laboratoire d'Électronique, de Technologie et d'Instrumentation (LETI) of the French Commissariat à l'Énergie Atomique. The IGN has developed a DTM production tool used in cartography. The key element of the system is a correlator, a program that automatically determines the pairs of homologous points in the stereoscopic pair being processed. This processing requires substantial computing power, which a SIMD (Single Instruction, Multiple Data) parallel computer such as SYMPATI-2, developed at LETI, can provide. This computer is integrated and marketed in the OpenVision vision system. While preserving the philosophy of the sequential program, we sought to exploit the program's data parallelism. Our analysis revealed a major bottleneck linked to the weak coupling between OpenVision's scalar processor and the parallel computer. We propose solutions to strengthen the scalar-parallel coupling. To accelerate the processing further, we evaluated the gain provided by SYMPHONIE, a SIMD computer that succeeds SYMPATI-2. We also developed a multi-correlator approach based on control parallelism, for which a MIMD (Multiple Instruction, Multiple Data) parallel structure is suited. Finally, we outline a multi-SIMD architecture that reconciles our two parallel approaches. This architecture can address all levels of image processing through strong coupling between the scalar processor and the SIMD parallel processor.
It is flexible through its modularity, and its communication network provides a reliability of interest for critical systems.
APA, Harvard, Vancouver, ISO, and other styles
12

Garcia, Fuentes Josep M. (Josep Maria). "La construcció del Montserrat modern = The construction of the modern Montserrat." Doctoral thesis, Universitat Politècnica de Catalunya, 2012. http://hdl.handle.net/10803/127349.

Full text
Abstract:
This PhD dissertation studies the process of construction of the modern Montserrat after the Napoleonic destruction of the old shrine-monastery and its disendowment in 1836, and shows how this process was not the reconstruction or restoration of the old destroyed monastery but the construction, or the "invention", of a new one. This process was initiated in 1844, when the shrine was reopened, and continued until the first half of the 20th century, in parallel with the definition of contemporary Catalonia and Spain. Understanding that the modern Montserrat is the result of the complex interaction between all the agents implied in its construction process, the dissertation addresses in eighteen chapters the main themes and the individuals and groups who intervened in it. These eighteen chapters are in turn structured into five groups, according to the major alterations in the equilibrium of influences between the agents involved in the process at each moment. The first group of chapters specifies the architectural, political and social problems of the first works, as well as the decisive intervention of Victor Balaguer and his attempt to define a Montserrat symbol "for all" within the wide national and federal symbolic universe he created; an interpretation based on the romantic approach to the mountain, defined in good part during Humboldt's visit in 1800. Balaguer's intervention centred on the valuation of Gothic architecture and on the first attempts to define an architectural project, but it was frustrated with the end of the reign of Amadeo I and of the federal dream. It was then that the first culture of Catalanism was defined, and that the "group of Vic" stepped in and altered, with their "patriotic-religious campaigns", Balaguer's symbolic construction to their own advantage, which led to the definition and valuation of Romanesque architecture, as well as the definitive popularisation of the mountain.
The second group of chapters then studies the intense symbolic construction of the mountain that took place at the end of the 19th century as a consequence of the tensions generated by the campaigns of the "group of Vic". The construction of the rack-and-pinion railway, the rosary monuments, the panorama in the Exposition of 1888, the architecture of Gaudí and others, and the popular reproductions of the mountain are some of the cases analysed. The architectural consequences that this symbolic construction had on the construction process are analysed in the third group of chapters, which culminates with the general project defined by Puig i Cadafalch. That project incorporates the unique mountain as an element of design and goes well beyond the dilemma of styles. Although construction of this project was initiated, it was never completed, due to the Spanish Civil War. The fourth group of chapters consists of a single chapter that makes a brief digression on Montserrat and the Western modern cultural construction of the mountain, together with its relation to the architecture of the 18th, 19th and 20th centuries. The fifth and last group of chapters returns to the chronological order and studies the process of construction during the first years of Franco's dictatorship, as well as the projects drawn up at the time, which failed owing to the changes that took place after the "Enthronement festivities" of 1947 and provoked the definitive wreck of the architectural construction of the modern Montserrat. Finally, the epilogue raises some of the most important questions the dissertation makes evident, emphasising the importance that tourism and the modern mass media had throughout the process, and how Montserrat, in a wider context, is a paradigmatic example of modern heritage-making processes.
The annex contains the cataloguing, by the author, of all the documents forming the "Architectural Archive of Montserrat", which, together with other materials, are analysed in the dissertation.
APA, Harvard, Vancouver, ISO, and other styles
13

Olsson, Johan. "Modern methods in cereal grain mycology /." Uppsala : Swedish Univ. of Agricultural Sciences (Sveriges lantbruksuniv.), 2000. http://epsilon.slu.se/avh/2000/91-576-5792-0.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
14

Wade, Adrian Paul. "Modern mathematical methods in analytical chemistry." Thesis, Swansea University, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.329720.

Full text
APA, Harvard, Vancouver, ISO, and other styles
15

Kaya, Ahmet. "Modern mathematical methods for actuarial sciences." Thesis, University of Leicester, 2017. http://hdl.handle.net/2381/39613.

Full text
Abstract:
In the ruin theory, premium income and outgoing claims play an important role. We introduce several ruin type mathematical models and apply various mathematical methods to find optimal premium price for the insurance companies. Quantum theory is one of the significant novel approaches to compute the finite time non-ruin probability. More exactly, we apply the discrete space Quantum mechanics formalism (see main thesis for formalism) and continuous space Quantum mechanics formalism (see main thesis for formalism) with the appropriately chosen Hamiltonians. Several particular examples are treated via the traditional basis and quantum mechanics formalism with the different eigenvector basis. The numerical results are also obtained using the path calculation method and compared with the stochastic modeling results. In addition, we also construct various models with interest rate. For these models, optimal premium prices are stochastically calculated for independent and dependent claims with different dependence levels by using the Frank copula method.
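For context, the classical baseline behind such ruin-type models, the Cramér-Lundberg surplus process, can be estimated by plain Monte Carlo. The sketch below is this textbook baseline with invented parameters; it is not the quantum-mechanics formalism of the thesis.

```python
import random

def ruin_probability(u0, c, lam, mu, horizon, n_paths, seed=1):
    """Fraction of simulated surplus paths u0 + c*t - claims that dip below 0."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)          # next Poisson claim arrival
            if t > horizon:
                break                          # survived the time horizon
            claims += rng.expovariate(1 / mu)  # exponential claim size
            if u0 + c * t - claims < 0:
                ruined += 1
                break
    return ruined / n_paths

# With positive safety loading (c > lam*mu) ruin should be markedly rarer
# than with none; all the concrete numbers here are illustrative only.
p_loaded = ruin_probability(u0=5.0, c=1.3, lam=1.0, mu=1.0, horizon=100.0, n_paths=2000)
p_flat = ruin_probability(u0=5.0, c=1.0, lam=1.0, mu=1.0, horizon=100.0, n_paths=2000)
print(p_loaded < p_flat)
```

Premium optimisation as studied in the thesis then amounts to choosing c (and richer contract features) against such non-ruin probabilities.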
APA, Harvard, Vancouver, ISO, and other styles
16

Peavoy, Daniel. "Methods of likelihood based inference for constructing stochastic climate models." Thesis, University of Warwick, 2012. http://wrap.warwick.ac.uk/58997/.

Full text
Abstract:
This thesis is about the construction of low dimensional diffusion models of climate variables. It assesses the predictive skill of models derived from a principled averaging procedure and a purely empirical approach. The averaging procedure starts from the equations for the original system then approximates the "weather" variables by a stochastic process. They are then averaged with respect to their invariant measure. This assumes that they equilibrate much faster than the climate variables. The empirical approach argues for a very general model form, then parameters are estimated using likelihood based inference for Stochastic Differential Equations. This is computationally demanding and relies upon Markov Chain Monte Carlo methods. A large part of this thesis is focused upon techniques to improve the efficiency of these algorithms. The empirical approach works well on simple one dimensional models but performs poorly on multivariate problems due to the rapid increase in unknown parameters. The averaging procedure is skillful in multivariate problems but is sensitive to lack of complete time scale separation in the system. In conclusion, the averaging procedure is better and can be improved by estimating parameters in a principled way based on the likelihood function and by including a latent noise process in the model.
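The empirical route described above can be illustrated in its simplest form: simulate a one-dimensional Ornstein-Uhlenbeck "climate" variable and score candidate drift parameters with the Euler-discretised Gaussian likelihood. All parameters and the coarse grid below are invented for the sketch; the thesis's MCMC machinery is not reproduced.

```python
import math
import random

def simulate_ou(a, s, dt, n, seed=0):
    """Euler-Maruyama path of dX = -a*X dt + s dW, started at 0."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(n):
        x += -a * x * dt + s * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

def euler_loglik(path, a, s, dt):
    """Gaussian log-likelihood of the Euler-discretised transition density."""
    var = s * s * dt
    ll = 0.0
    for x0, x1 in zip(path, path[1:]):
        mean = x0 - a * x0 * dt
        ll += -0.5 * math.log(2 * math.pi * var) - (x1 - mean) ** 2 / (2 * var)
    return ll

path = simulate_ou(a=1.0, s=0.5, dt=0.01, n=20000)
# Score a coarse grid of drift candidates; the true value should win.
best = max([0.25, 1.0, 4.0], key=lambda a: euler_loglik(path, a, 0.5, 0.01))
print(best)
```

In more than one dimension the number of drift parameters grows rapidly, which is exactly the weakness of the empirical approach that the abstract notes.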
APA, Harvard, Vancouver, ISO, and other styles
17

Buciuni, Francesco. "Applications of modern methods for scattering amplitudes." Thesis, Durham University, 2018. http://etheses.dur.ac.uk/12732/.

Full text
Abstract:
The large amount of new high energy data being collected by the LHC experiments has the potential to provide new information about the nature of the fundamental forces through precision comparisons with the Standard Model. These precision measurements require intensive perturbative scattering amplitude computations with large multiplicity final states. In this thesis we develop new on-shell methods for the analytic computation of scattering amplitudes in QCD which offer improved evaluation speed and numerical stability over currently available techniques and also allow us to explore the structure of amplitudes in gauge theories. We apply these techniques to extract compact analytic expressions for the triple collinear splitting functions at one loop in QCD and supersymmetric gauge theories, which contribute to the universal factorisation at N${}^3$LO. We also investigate improvements to dimensionally regulated one-loop amplitude computations by combining the six-dimensional spinor helicity formalism and a momentum twistor parameterisation with the integrand reduction and generalised unitarity methods. This allowed the development of a completely algebraic approach to the computation of dimensionally regulated amplitudes in QCD including massive fermions. We present applications to Higgs plus five-gluon scattering in the large top mass limit and top pair production with up to three partons. In the case of massive one-loop amplitudes we present a new approach to the problem of wave-function renormalisation which only requires gauge invariant, on-shell building blocks. Massive one-loop amplitudes contain information that cannot be extracted from unregulated cuts; the new approach instead constrains the amplitudes using the universal poles in $6-2\epsilon$ dimensions, which can be computed from an effective Lagrangian of dimension-six operators.
APA, Harvard, Vancouver, ISO, and other styles
18

Paulson, Joel Anthony. "Modern control methods for chemical process systems." Thesis, Massachusetts Institute of Technology, 2017. http://hdl.handle.net/1721.1/109672.

Full text
Abstract:
Thesis: Ph. D., Massachusetts Institute of Technology, Department of Chemical Engineering, 2017. Cataloged from PDF version of thesis. Includes bibliographical references (pages 301-322).
Strong trends in chemical engineering have led to increased complexity in plant design and operation, which has driven the demand for improved control techniques and methodologies. Improved control directly leads to smaller usage of resources, increased productivity, improved safety, and reduced pollution. Model predictive control (MPC) is the most advanced control technology widely practiced in industry. This technology, initially developed in the chemical engineering field in the 1970s, was a major advance over earlier multivariable control methods due to its ability to seamlessly handle constraints. However, limitations in industrial MPC technology spurred significant research over the past two to three decades in the search for increased capability. For these advancements to be widely implemented in industry, they must adequately address all of the issues associated with control design while meeting all of the control system requirements, including:
-- The controller must be insensitive to uncertainties including disturbances and unknown parameter values.
-- The controlled system must perform well under input, actuator, and state constraints.
-- The controller should be able to handle a large number of interacting variables efficiently as well as nonlinear process dynamics.
-- The controlled system must be safe, reliable, and easy to maintain in the presence of system failures/faults.
This thesis presents a framework for addressing these problems in a unified manner. Uncertainties and constraints are handled by extending current state-of-the-art MPC methods to handle probabilistic uncertainty descriptions for the unknown parameters and disturbances.
Sensor and actuator failures (at the regulatory layer) are handled using a specific internal model control structure that allows the regulatory control layer to perform optimally whenever one or more controllers are taken offline due to failures. Non-obvious faults, which may lead to catastrophic system failure if not detected early, are handled using a model-based active fault diagnosis method, which is also able to cope with constraints and uncertainties. These approaches are demonstrated on industrially relevant examples, including crystallization and bioreactor processes. By Joel Anthony Paulson. Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
19

Zajíc, Jiří. "Modern Methods for Tree Graph Structures Rendering." Master's thesis, Vysoké učení technické v Brně. Fakulta informačních technologií, 2013. http://www.nusl.cz/ntk/nusl-412891.

Full text
Abstract:
This project addresses the display of large hierarchical structures, in particular the options for visualising tree graphs. The goal is the implementation of a hyperbolic browser in a web environment, which exploits the potential of non-Euclidean geometry to project the tree onto the hyperbolic plane. Great emphasis is placed on user-friendly manipulation of the displayed model and on easy orientation.
APA, Harvard, Vancouver, ISO, and other styles
20

Reddi, Sashank Jakkam. "New Optimization Methods for Modern Machine Learning." Research Showcase @ CMU, 2017. http://repository.cmu.edu/dissertations/1116.

Full text
Abstract:
Modern machine learning systems pose several new statistical, scalability, privacy and ethical challenges. With the advent of massive datasets and increasingly complex tasks, scalability has especially become a critical issue in these systems. In this thesis, we focus on fundamental challenges related to scalability, such as computational and communication efficiency, in modern machine learning applications. The underlying central message of this thesis is that classical statistical thinking leads to highly effective optimization methods for modern big data applications. The first part of the thesis investigates optimization methods for solving large-scale nonconvex Empirical Risk Minimization (ERM) problems. Such problems have surged into prominence, notably through deep learning, and have led to exciting progress. However, our understanding of optimization methods suitable for these problems is still very limited. We develop and analyze a new line of optimization methods for nonconvex ERM problems, based on the principle of variance reduction. We show that our methods exhibit fast convergence to stationary points and improve the state-of-the-art in several nonconvex ERM settings, including nonsmooth and constrained ERM. Using similar principles, we also develop novel optimization methods that provably converge to second-order stationary points. Finally, we show that the key principles behind our methods can be generalized to overcome challenges in other important problems such as Bayesian inference. The second part of the thesis studies two critical aspects of modern distributed machine learning systems — asynchronicity and communication efficiency of optimization methods. We study various asynchronous stochastic algorithms with fast convergence for convex ERM problems and show that these methods achieve near-linear speedups in sparse settings common to machine learning. 
Another key factor governing the overall performance of a distributed system is its communication efficiency. Traditional optimization algorithms used in machine learning are often ill-suited for distributed environments with high communication cost. To address this issue, we discuss two different paradigms to achieve communication efficiency of algorithms in distributed environments and explore new algorithms with better communication complexity.
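The variance-reduction principle that underpins the methods in this abstract can be illustrated with a minimal SVRG-style loop for a finite-sum problem (a generic sketch, not code from the thesis; all function and parameter names are illustrative):

```python
import numpy as np

def svrg(grad_i, w0, n, step=0.02, epochs=30, inner=100, seed=0):
    """Minimal SVRG for min_w (1/n) * sum_i f_i(w).

    grad_i(w, i) returns the gradient of the i-th component at w.
    Each epoch recomputes a full gradient at a snapshot point; inner
    steps use the variance-reduced direction
    g_i(w) - g_i(w_snap) + full_grad, whose noise vanishes near w_snap.
    """
    rng = np.random.default_rng(seed)
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        full_grad = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(inner):
            i = rng.integers(n)
            w -= step * (grad_i(w, i) - grad_i(w_snap, i) + full_grad)
    return w

# Toy least-squares instance: f_i(w) = 0.5 * (a_i @ w - b_i) ** 2
rng = np.random.default_rng(1)
A = rng.normal(size=(100, 5))
w_true = np.arange(5.0)
b = A @ w_true
w_hat = svrg(lambda w, i: (A[i] @ w - b[i]) * A[i], np.zeros(5), n=100)
```

SVRG is one classical member of the variance-reduction family; the thesis extends such ideas to nonconvex, nonsmooth and constrained ERM settings.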
APA, Harvard, Vancouver, ISO, and other styles
21

Ren, Zhiwei. "Portfolio Construction using Clustering Methods." Link to electronic thesis, 2005. http://www.wpi.edu/Pubs/ETD/Available/etd-042605-092010/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

DeSouza, Chelsea E. "The Greek Method of Exhaustion: Leading the Way to Modern Integration." The Ohio State University, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=osu1338326658.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Barbarroux, Loïc. "Contributions à la modélisation multi-échelles de la réponse immunitaire T-CD8 : construction, analyse, simulation et calibration de modèles." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSEC026/document.

Full text
Abstract:
Upon infection by an intracellular pathogen, the organism triggers a specific immune response, mainly driven by the CD8 T cells. These cells are responsible for the eradication of this type of infection and the constitution of the immune repertoire of the individual. The immune response is constituted by many processes which act over several interconnected physical scales (intracellular scale, single cell scale, cell population scale). This biological phenomenon is therefore a complex process, for which it is difficult to observe or measure the links between the different processes involved. 
We propose three multiscale mathematical models of the CD8 immune response, built with different formalisms but related by the same idea: to make the behavior of the CD8 T cells depend on their intracellular content. For each model, we present, if possible, its construction process based on selected biological hypotheses, its mathematical study and its ability to reproduce the immune response using numerical simulations. The models we propose successfully reproduce qualitatively and quantitatively the CD8 immune response and thus constitute useful tools to further investigate this biological phenomenon.
APA, Harvard, Vancouver, ISO, and other styles
24

Viglundsson, Viglundur Thor. "Modern fleet planning methods for ocean liner service." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/35008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Ljungkvist, Karl. "Techniques for finite element methods on modern processors." Licentiate thesis, Uppsala universitet, Avdelningen för beräkningsvetenskap, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-242186.

Full text
Abstract:
In this thesis, methods for efficient utilization of modern computer hardware for numerical simulation are considered. In particular, we study techniques for speeding up the execution of finite-element methods. One of the greatest challenges in finite-element computation is how to perform the system matrix assembly efficiently in parallel, due to its complicated memory access pattern. The main difficulty lies in the fact that many entries of the matrix are being updated concurrently by several parallel threads. We consider transactional memory, an exotic hardware feature for concurrent update of shared variables, and conduct benchmarks on a prototype processor supporting it. Our experiments show that transactions can both simplify programming and provide good performance for concurrent updates of floating point data. Furthermore, we study a matrix-free approach to finite-element computation which avoids the matrix assembly. Motivated by its computational properties, we implement the matrix-free method for execution on graphics processors, using either atomic updates or a mesh coloring approach to handle the concurrent updates. A performance study shows that on the GPU, the matrix-free method is faster than a matrix-based implementation for many element types, and allows for solution of considerably larger problems. This suggests that the matrix-free method can speed up execution of large realistic simulations.
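The matrix-free idea described above can be sketched in serial Python: the operator is applied by looping over elements and scattering local contributions, so the global matrix is never formed (an illustrative 1D Poisson example, not the thesis's GPU implementation):

```python
import numpy as np

def apply_stiffness_matrix_free(u, n_el, h):
    """Apply the 1D linear-FEM Poisson stiffness operator A @ u without
    assembling A: loop over elements and scatter each 2x2 local
    contribution (1/h) * [[1, -1], [-1, 1]] into the result vector."""
    k_loc = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    out = np.zeros(n_el + 1)
    for e in range(n_el):
        dofs = [e, e + 1]              # global DOFs of element e
        out[dofs] += k_loc @ u[dofs]   # shared-DOF updates between elements
    return out

# Sanity check against the assembled tridiagonal matrix on a small mesh.
n_el, h = 8, 1.0 / 8
A = np.zeros((n_el + 1, n_el + 1))
for e in range(n_el):
    A[e:e + 2, e:e + 2] += (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
u = np.random.default_rng(0).normal(size=n_el + 1)
assert np.allclose(A @ u, apply_stiffness_matrix_free(u, n_el, h))
```

On a GPU, the scatter in the inner loop is exactly where atomic updates or mesh coloring become necessary, since neighbouring elements write to shared degrees of freedom.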
APA, Harvard, Vancouver, ISO, and other styles
26

Obal, Walter Douglas 1966. "Measure-adaptive state-space construction methods." Diss., The University of Arizona, 1998. http://hdl.handle.net/10150/288893.

Full text
Abstract:
Much work has been done on the problem of stochastic modeling for the evaluation of performance, dependability and performability properties of systems, but little attention has been given to the interplay between the model and the performance measure of interest. Our work addresses the problem of automatically constructing Markov processes tailored to the structure of the system and the nature of the performance measures of interest. To solve this problem, we have developed new techniques for detecting and exploiting symmetry in the model structure, new reward variable specification techniques, and new state-space construction procedures. We propose a new method for detecting and exploiting model symmetry in which (1) models retain the structure of the system, and (2) all symmetry inherent in the structure of the model can be detected and exploited for the purposes of state-space reduction. Then, we extend the array of performance measures that may be derived from a given system model by introducing a class of path-based reward variables, which allow rewards to be accumulated based on sequences of states and transitions. Finally, we describe a new reward variable specification formalism and state-space construction procedure for automatically computing the appropriate level of state-space reduction based on the nature of the reward variables and the structural symmetry in the system model.
APA, Harvard, Vancouver, ISO, and other styles
27

Nelson, Catherine Elizabeth. "Methods for constructing 3D geological and geophysical models of flood basalt provinces." Thesis, Durham University, 2010. http://etheses.dur.ac.uk/488/.

Full text
Abstract:
In this thesis, realistic 3D geological models of flood basalt provinces are constructed. These models are based on outcrop observations and remote sensing data from the North Atlantic Igneous Province, collected by a variety of methods including terrestrial laser scanning. Geophysical data are added to the models to make them suitable for generating synthetic seismic data. Flood basalt provinces contain a number of different volcanic facies, distinguished by their outcrop appearance and physical properties. These include tabular-classic and compound-braided lava flows, intrusions and hyaloclastites. 3D models are constructed for tabular-classic lava flows based on satellite data from Iceland and laser scanning data from a variety of locations. Models for compound-braided lava flows are based on terrestrial laser scanning data and field observations from the Faroe Islands and the Isle of Skye. An additional finding of this work is that volcanic facies can be differentiated in wireline log data from boreholes. Facies show characteristic velocity distributions which can be linked to onshore observations and used to understand volcanic facies in offshore boreholes. Data from boreholes on the Faroe Islands are used to add seismic velocities to the 3D geological models above. This thesis also develops methods and workflows for constructing 3D geological models of flood basalt lava flows. The collection of digital 3D data using terrestrial laser scanning is evaluated, and data processing workflows are developed.
APA, Harvard, Vancouver, ISO, and other styles
28

Jeong, Namin. "A surfacelet-based method for constructing geometric models of microstructure." Diss., Georgia Institute of Technology, 2015. http://hdl.handle.net/1853/54438.

Full text
Abstract:
Integration of material composition, microstructure, and mechanical properties with geometry information enables many product development activities, including design, analysis, and manufacturing. To address such needs, models of material composition have been integrated into CAD systems, creating what are called heterogeneous CAD modeling systems. In order to support the heterogeneous CAD system, extensive process-structure-property relationships have to be captured and integrated into the current CAD system. A new method for reverse engineering of materials will be presented such that microstructure models can be constructed and used in the heterogeneous CAD system. Reverse engineering of materials consists of three parts: image analysis, structure-property-process relationships, and a repository. In this research, an image processing method, which comprises the Radon transform and the wavelet transform, will be used in order to recognize geometric features from a microstructure image. Recognition of geometric features is achieved by combining three techniques (masking, clustering, and the high-frequency components of the wavelet transform) that are integrated with the Radon transform. Then, the recognized geometric features can be used to construct an explicit geometric model of microstructure. The proposed work will provide an explicit mathematical method to recognize and to quantify microstructure features from an image. In addition, explicit geometric models of microstructure can be automatically constructed and utilized to get effective mechanical properties, establishing the structure-property relationship of the material. In order to demonstrate this, a polymer nano-composite sample and a metal alloy sample will be used.
APA, Harvard, Vancouver, ISO, and other styles
29

O'Malley, Sean P. "Construction and testing of a modern acoustic impedance tube." Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2001. http://handle.dtic.mil/100.2/ADA393679.

Full text
APA, Harvard, Vancouver, ISO, and other styles
30

Kang, Leiter 1978. "Modern spectral estimation methods applied to FOPEN SAR imagery." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/86662.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

Jones, Damien. "Automated Rodent Sleep Analysis with Modern Machine Learning Methods." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-229398.

Full text
Abstract:
Sleep staging is the use of electrophysiological signals to determine the quality and architecture of sleep in an animal. Currently, to achieve this, researchers manually classify contiguous sections of electroencephalographic and electromyographic signals into defined sleep modes or stages. This is a slow and laborious process. Many attempts at semiautomated solutions can be found in the literature. In these attempts, a researcher manually classifies a portion of the data from a specific rodent. This data is used to train a model which can then be used to classify the rest of the data from that rodent. While such solutions can be found in commercial products, they still require hours of manual classification to be done by the researcher. In this thesis, I explore two machine learning methods in an attempt to fully automate the process of sleep staging. The automation consists of building a classifier that can classify data from a new rodent, using only manually classified data from previous rodents. This classifier should classify this new rodent’s data at a sufficiently high accuracy. While there have also been attempts at such a system in the past, none of them have reached a level of accuracy that is acceptable for use. The two methods implemented in this thesis are support vector machines (SVM) and convolutional neural networks (CNN). The results obtained are promising, with the results from SVM being on the cusp of real-world usability for automated sleep staging.
APA, Harvard, Vancouver, ISO, and other styles
32

Kesiz, Abnousi Vartan. "Modern Econometric Methods for the Analysis of Housing Markets." Diss., Virginia Tech, 2021. http://hdl.handle.net/10919/103529.

Full text
Abstract:
The increasing availability of richer, high-dimensional, home sales data-sets, as well as spatially geocoded data, allows for the use of new econometric and computational methods to explore novel research questions. This dissertation consists of three separate research papers which aim to leverage this trend to answer empirical inferential questions, propose new computational approaches in environmental valuation, and address future challenges. The first research chapter estimates the effect on home values of 10 large-scale urban stream restoration projects situated near the project sites. The study area is the Johnson Creek Watershed in Portland, Oregon. The research design incorporates four matching model approaches that vary based on the width of the temporal bands, a narrow and a wider band, and on two spatial zoning buffers, a smaller and a larger one, which account for the affected homes' distances. Estimated effects tend to be positive for six projects when the restoration projects' distance is smaller and the temporal bands are narrow, while two restoration projects have positive effects on home values across all four modeling approaches. The second research chapter focuses on the underlying statistical and computational properties of matching methods for causal treatment effects. The prevailing notion in the literature is that there is a tradeoff between bias and variance linked to the number of matched control observations for each treatment unit. In addition, in the era of Big Data, there is a paucity of research addressing the tradeoffs between inferential accuracy and computational time across different matching methods. Is it worth employing computationally costly matching methods if the gains in bias reduction and efficiency are negligible? We revisit the notion of a bias-variance tradeoff and address the subject of computational time considerations. We conduct a simulation study and evaluate 160 models and 320 estimands. 
The results suggest that the conventional notion of a bias-variance tradeoff, with bias increasing and variance decreasing with the number of matched controls, does not hold under the bias-corrected matching estimator (BCME), developed by Abadie and Imbens (2011). Specifically, for the BCME, the trend of bias decreases as the number of matches per treated unit increases. Moreover, when the pre-matching balance's quality is already good, choosing only one match results in a significantly larger bias under all methods and estimators. In addition, the genetic search matching algorithm, GenMatch, is superior compared to the baseline Greedy Method by achieving a better balance between the observed covariate distributions of the treated and matched control groups. On the down side, GenMatch is 408 times slower compared to a greedy matching method. However, when we employ the BCME on matched data, there is a negligible difference in bias reduction between the two matching methods. Traditionally, environmental valuation methods using residential property transactions follow two approaches, hedonic price functions and Random Utility sorting models. An alternative approach is the Iterated Bidding Algorithm (IBA), introduced by Kuminoff and Jarrah (2010). This third chapter aims to improve the IBA approach to property and environmental valuation compared to its early applications. We implement this approach in an artificially simulated residential housing market, maintaining full control over the data generating mechanism. We implement the Mesh Adaptive Direct Search Algorithm (MADS) and introduce a convergence criterion that leverages the knowledge of individuals' actual pairing to homes. We proceed to estimate the preference parameters of the distribution of an underlying artificially simulated housing market. 
We estimate with significantly higher precision than the original baseline Nelder-Mead optimization, which relied only on a price-discrepancy convergence criterion, as implemented during the IBA's earlier applications.
The increasing availability of richer, high-dimensional home sales data sets enables us to employ new methods to explore novel research questions involving housing markets. This dissertation consists of three separate research papers which leverage this trend. The first research paper estimates the effects on home values of 10 large-scale urban stream restoration projects in Portland, Oregon. These homes are located near the project sites. The results show that the distance of the homes from the project sites and the duration of the construction cause different effects on home values. However, two restorations have positive effects regardless of the distance and the duration period. The second research study is focused on the issue of causality. The study demonstrates that a traditional notion concerning causality known as the ``bias-variance tradeoff" is not always valid. In addition, the research shows that sophisticated but time-consuming algorithms have negligible effects in improving the accuracy of estimating the causal effects when we account for the required computational time. The third research study improves an environmental evaluation method that relies on residential property transactions. The methodology leverages the features of more informative residential data sets in conjunction with a more efficient optimization method, leading to significant improvements. The study concludes that due to these improvements, this alternative method can be employed to elicit the true preferences of homeowners over housing and locational characteristics by avoiding the shortcomings of existing techniques.
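The matching estimators studied in the second chapter can be illustrated with a plain m-nearest-neighbour ATT estimator on synthetic data (a hedged sketch: this is the uncorrected matching estimator, not the bias-corrected estimator of Abadie and Imbens analyzed in the dissertation, and the data and names are illustrative):

```python
import numpy as np

def att_nn_matching(X, y, treated, m=1):
    """Average treatment effect on the treated (ATT) via m-nearest-
    neighbour covariate matching, without bias correction."""
    Xc, yc = X[~treated], y[~treated]
    effects = []
    for x_i, y_i in zip(X[treated], y[treated]):
        dist = np.linalg.norm(Xc - x_i, axis=1)
        nearest = np.argsort(dist)[:m]          # m closest controls
        effects.append(y_i - yc[nearest].mean())
    return float(np.mean(effects))

# Synthetic data with a known constant treatment effect of 2.0 and
# treatment assigned independently of the covariates.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))
treated = rng.random(500) < 0.5
y = X @ np.array([1.0, -1.0, 0.5]) + 2.0 * treated
att = att_nn_matching(X, y, treated, m=1)
```

Increasing m trades matching discrepancy against variance, which is precisely the tradeoff the chapter revisits for the bias-corrected estimator.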
APA, Harvard, Vancouver, ISO, and other styles
33

Raof, Abdul-Hussein F. "Subject, theme and agent in Modern Standard Arabic." Thesis, University of Leeds, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.328901.

Full text
APA, Harvard, Vancouver, ISO, and other styles
34

Higham, Jeffrey T. "Construction methods for row-complete Latin squares." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 1997. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp04/nq21356.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Jiang, Hao, and 姜昊. "Construction and computation methods for biological networks." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2013. http://hub.hku.hk/bib/B50662144.

Full text
Abstract:
Biological systems are complex in that they comprise large numbers of interacting entities, and their dynamics follow mechanistic regulations for movement and biological function organization. Computational modeling is an established and powerful approach to studying and manipulating biologically relevant systems. The inner structure and behavior of complex biological systems can be analyzed and understood through computable biological networks. In this thesis, models and computation methods are proposed for biological networks. The study of Genetic Regulatory Networks (GRNs) is an important research topic in genomic research. Several promising techniques have been proposed for capturing the behavior of gene regulations in biological systems. One of the promising models for GRNs, the Boolean Network (BN), has gained a lot of attention. However, little light has been shed on the analysis of internal connections between the dynamics of biological molecules and network systems. Inference and completion problems of a BN from a given set of singleton attractors are considered to be important in understanding the relationship between the dynamics of biological molecules and network systems. A discrete dynamic systems model has recently been proposed to model time-course microarray measurements of genes, but delay effects should be modeled as a realistic factor in studying GRNs. A delay discrete dynamic systems model is therefore developed to model GRNs. Inference and analysis of networks is one of the grand challenges in modern statistical biology. Machine learning methods, in particular the Support Vector Machine (SVM), have been successfully applied in predicting internal connections embedded in networks. Kernels in conjunction with SVMs demonstrate a strong ability to perform various tasks such as biomedical diagnosis, function prediction and motif extraction. In biomedical diagnosis, data sets are always high-dimensional, which provides a challenging research problem in the machine learning area. 
Novel kernels based on distance metrics that are not common in the machine learning framework are proposed for a possible tumor differentiation discrimination problem. The protein function prediction problem is a hot topic in bioinformatics. The K-spectrum kernel is among the most popular models for describing protein sequences. Taking into consideration positive semi-definiteness in kernel construction, an Eigen-matrix translation technique is introduced in a novel kernel formulation to give better prediction results. In a further step, the power of the Eigen-matrix translation technique in feature selection is demonstrated through mathematical formulation. Due to the structural complexity of carbohydrates, the study of carbohydrate sugar chains has lagged behind that of DNA and proteins. A weighted q-gram kernel is constructed for classifying glycan structures, with limitations in feature extraction. A biochemically-weighted tree kernel is then proposed to enhance the ability in both classification and motif extraction. Finally, the problem of metabolite biomarker discovery is researched. Human diseases, in particular metabolic diseases, can be directly caused by the lack of essential metabolites. Identification of metabolite biomarkers has significant importance in the study of biochemical reaction and signaling networks. A promising computational approach is proposed to identify metabolic biomarkers through integrating biomedical data and disease-specific gene expression data.
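The K-spectrum kernel mentioned in the abstract has a very compact definition, the inner product of k-mer count vectors, which can be sketched as follows (an illustrative implementation; the thesis builds its Eigen-matrix translation technique on top of such kernels):

```python
from collections import Counter

def k_spectrum_kernel(s, t, k=3):
    """K-spectrum kernel: K(s, t) = sum over all k-mers u of
    count_s(u) * count_t(u), i.e. the dot product of k-mer counts."""
    cs = Counter(s[i:i + k] for i in range(len(s) - k + 1))
    ct = Counter(t[i:i + k] for i in range(len(t) - k + 1))
    return sum(cs[u] * ct[u] for u in cs.keys() & ct.keys())

print(k_spectrum_kernel("MKTAYIAK", "MKTAYIAK", k=3))  # 6 distinct 3-mers, each counted once
```

Because it is an inner product of explicit count vectors, the resulting Gram matrix is positive semi-definite by construction; the positive semi-definiteness concern the abstract mentions arises when such kernels are modified.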
APA, Harvard, Vancouver, ISO, and other styles
36

Hopper, Rachel Anne. "Methods of construction of novel dendritic architectures." Thesis, University of Birmingham, 2002. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.396116.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Ayers, Kristin Lynn. "Methods for haplotype construction and their applications." Diss., Restricted to subscribing institutions, 2008. http://proquest.umi.com/pqdweb?did=1568065851&sid=1&Fmt=2&clientId=1564&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Jones, Doyle Michael. "Masonry ornament : applications of masonry construction in post-modern architecture." Thesis, Georgia Institute of Technology, 1987. http://hdl.handle.net/1853/24139.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Fok, Tat-man, and 霍達文. "A study of the pivotal construction in modern standard Chinese." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1985. http://hub.hku.hk/bib/B31948777.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Fok, Tat-man. "A study of the pivotal construction in modern standard Chinese." [Hong Kong : University of Hong Kong], 1985. http://sunzi.lib.hku.hk/hkuto/record.jsp?B12323834.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Boyacıoğlu, Bilgen Erkarslan Özlem. "The construction of Turkish modern architecture in architectural history writing/." [s.l.]: [s.n.], 2003. http://library.iyte.edu.tr/tezler/master/mimarlik/T000289.rar.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

Chiropa, Moses Tinashe. "The project implementation profile's applicability to the modern construction industry." Master's thesis, University of Cape Town, 2018. http://hdl.handle.net/11427/29268.

Full text
Abstract:
This research evaluated the applicability of the current project implementation profile (PIP) tool to the modern construction industry. The research also aimed to identify any new critical success factors (CSFs) to deliver successful construction projects. The research questions were: a. How applicable are the success factors from the PIP tool in delivering successful modern construction projects? b. Are there other success factors that may be considered for inclusion in the PIP tool for modern construction projects? Critical success factors were identified through a deep literature review. An online web-based questionnaire with the critical success factors was then developed, and this tool was utilized to gather data for the research from various project management stakeholders. Collected information was summarized, analyzed and discussed, leading to a conclusion. The research identified a revised list of 10 key success factors (KSF), which comprised 6 non-PIP factors: adequate budget; client requirements; competence of project manager; competence of contractors, subcontractors and suppliers; risk management and design, and 4 PIP factors: client consultation, communication, client acceptance and top management support. From the 20 KSFs that were under investigation, "Adequate budget" was the factor that scored the highest and the lowest scored was "support from other departments." The research also concluded that the success factors from the existing PIP tool are not sufficient in delivering successful modern construction projects and there are additional success factors that can be considered for inclusion in the PIP tool to aid modern construction project success. To strengthen the PIP success factors in response to the research questions, it is necessary to execute additional research in this area; in particular, the actual questions used by the tool and the assessment framework need to be revised in light of this research.
APA, Harvard, Vancouver, ISO, and other styles
43

Nesbakken, Anders. "Evaluation of Modern Design Methods for use in Computer Experiments." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for matematiske fag, 2011. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-12582.

Full text
Abstract:
We have compared the recently developed Multi-level binary replacement (MBR) design method for use in computer experiments to the Latin hypercube design (LHD) and the Orthogonal array (OA) design. As a means of comparison, we have suggested an algorithm for drawing permutations of the MBR design, so as to obtain what we have called an MBR-based Latin hypercube design. In our comparison study, the main focus has been the design scores with respect to the root mean squared error (RMSE), Max and alias sum of squares criteria. We found that the MBR design generally performed well with respect to all criteria. It scored similarly to the OA design method and better than conventional Latin hypercube sampling. The score, however, varied with the number of samples and the set of design generators chosen for constructing the MBR design. The MBR design performed better for designs with a relatively high number of samples compared to the number of factors.
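For reference, the conventional Latin hypercube sampling that the MBR method is compared against can be drawn in a few lines (a standard LHS sketch, not the MBR construction itself; names are illustrative):

```python
import numpy as np

def latin_hypercube(n, d, seed=0):
    """Draw an n-point Latin hypercube sample in [0, 1)^d: along every
    axis, each of the n equal-width bins contains exactly one point."""
    rng = np.random.default_rng(seed)
    # One independent random permutation of bin indices per dimension.
    bins = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (bins + rng.random((n, d))) / n

pts = latin_hypercube(8, 2)
```

The one-point-per-bin property along each axis is what distinguishes LHS from plain uniform sampling; the MBR-based Latin hypercube described in the abstract is instead obtained by drawing such permutations from an MBR design.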
APA, Harvard, Vancouver, ISO, and other styles
44

Shi, Xie-Qi. "Comparative studies of modern methods for caries detection and quantification /." Stockholm, 2001. http://diss.kib.ki.se/2001/91-628-4702-3/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
45

Dunkley, Joanna. "Modern methods for cosmological parameter estimation : beyond the adiabatic paradigm." Thesis, University of Oxford, 2005. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.441310.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Meyer, Roland. "Integrative taxonomy of decapod crustaceans with traditional and modern methods." Diss., Ludwig-Maximilians-Universität München, 2014. http://nbn-resolving.de/urn:nbn:de:bvb:19-170910.

Full text
Abstract:
Decapod crustaceans (Decapoda) are a globally distributed, highly diverse and well-adapted group of arthropods found in almost all aquatic ecosystems as well as in terrestrial habitats. The enormous number of species, with about 17,635 recent and fossil species known (De Grave et al., 2009), together with the great age of the group makes the systematic classification of individual species difficult. Fossil decapods have been dated back to the Devonian (about 415 to 359.2 Mya) (Schram et al., 1978). The recent representatives have thus undergone many millions of years of evolution, and the results of this long process are reflected in a high morphological diversity among species. To establish a reliable phylogeny and to characterize species unambiguously, new characters, methods and approaches are required. Reliable identification and classification of species forms the basis of databases and projects such as GenBank, Barcoding of Life (BOLD), German Barcode of Life (GBOL) and Barcoding Fauna Bavarica, underlining the high significance of taxonomic work. The aim of this cumulative dissertation is to develop new character sets for the better characterization and delimitation of species, using modern morphological and molecular methods such as scanning electron microscopy (SEM), fluorescence microscopy and the analysis of mitochondrial DNA sequences (cytochrome c oxidase subunit 1, COI). In addition, classical methods, such as the weighing of morphological characters, are applied in an integrative approach to species delimitation.
SEM allows far higher magnification than classical light microscopy, with greater resolution and depth of field. Even the smallest eidonomic (diagnostic) characters, such as the dorsal organ or individual setae types of zoea larvae, could thus be described in detail and used as new or previously little-noticed morphological characters for systematic classification (publications I, II and III). Furthermore, fluorescence microscopy with DAPI staining showed that the arrangement of cell nuclei in zoea larvae of the infraorders Caridea, Anomura and Brachyura exhibits characteristic patterns; this criterion is discussed as a possible character set in taxonomy (publication VI). Publication V covers a further field of modern taxonomy: molecular analyses based on the mitochondrial protein-coding COI ("barcoding") gene. For the first time, the decapod fauna of the southern Chilean fjord region was surveyed and analysed with an integrative-taxonomic approach, combining classical morphological characters with morphology-independent molecular characters. Closely related species of the genera Eurypodius Guérin, 1825 and Acanthocyclus Lucas, in H. Milne Edwards & Lucas, 1844, which are difficult to separate morphologically, could be newly characterized. A prior inventory of the southern Chilean decapod fauna, compiled during numerous expeditions to the region, provided the basis for this taxonomic work; about 650 samples are deposited at the Bavarian State Collection of Zoology (Zoologische Staatssammlung München).
Detailed documentation with various imaging methods, including extended-depth-of-field photographs and in situ photos of the different species of this still largely unexplored region, forms the backbone of the taxonomic work and has been published as a chapter in the bilingual (Spanish and English) standard work for the Chilean fjord region (publication VI).
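The COI "barcoding" component of such an integrative approach rests on a simple idea: aligned mitochondrial sequences of conspecific specimens differ less than sequences of distinct species, so pairwise distance plus a threshold yields putative species groups. The sketch below is purely illustrative and not taken from the thesis; the toy sequences, specimen names, and the 3% threshold are hypothetical examples.

```python
# Hypothetical sketch of barcoding-style species delimitation:
# pairwise p-distance on aligned COI fragments, then greedy
# single-linkage grouping under a distance threshold.

def p_distance(a: str, b: str) -> float:
    """Proportion of differing sites between two aligned sequences."""
    if len(a) != len(b):
        raise ValueError("sequences must be aligned to equal length")
    diffs = sum(1 for x, y in zip(a, b) if x != y)
    return diffs / len(a)

def cluster_by_threshold(seqs: dict, threshold: float = 0.03) -> list:
    """Group specimens whose COI distance to any cluster member
    falls below the threshold as putative conspecifics."""
    clusters: list[list[str]] = []
    for name, seq in seqs.items():
        for cluster in clusters:
            if any(p_distance(seq, seqs[m]) < threshold for m in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

# Toy 18-bp fragments (invented for illustration only).
specimens = {
    "Eurypodius_A1":    "ATGGCTTTAAGAGCTTGA",
    "Eurypodius_A2":    "ATGGCTTTAAGAGCTTGA",
    "Acanthocyclus_B1": "ATGACTCTTAGGGCATGA",
}
print(cluster_by_threshold(specimens))
# → [['Eurypodius_A1', 'Eurypodius_A2'], ['Acanthocyclus_B1']]
```

In practice this is done on ~650-bp COI alignments with model-corrected distances and dedicated tools rather than a raw p-distance, but the clustering logic is the same.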
APA, Harvard, Vancouver, ISO, and other styles
47

Akufo, Kwabena D. "Theological bases for modern management methods for the local church." Online full text .pdf document, available to Fuller patrons only, 2004. http://www.tren.com.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Thiebaud, Maribel Alvarez de. "Estimation of viable cell count by modern and improved methods." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9883.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

Roshyara, Nab Raj, Katrin Horn, Holger Kirsten, Peter Ahnert, and Markus Scholz. "Comparing performance of modern genotype imputation methods in different ethnicities." Universitätsbibliothek Leipzig, 2016. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-213865.

Full text
Abstract:
A variety of modern software packages are available for genotype imputation, relying on advanced concepts such as pre-phasing of the target dataset or utilization of admixed reference panels. In this study, we performed a comprehensive evaluation of the accuracy of modern imputation methods on the basis of the publicly available POPRES samples. Good-quality genotypes were masked and re-imputed by different imputation frameworks, namely MaCH, IMPUTE2, MaCH-Minimac, SHAPEIT-IMPUTE2 and MaCH-Admix. Results were compared to evaluate the relative merit of pre-phasing and the usage of admixed references. We showed that the pre-phasing framework SHAPEIT-IMPUTE2 can overestimate the certainty of genotype distributions, resulting in the lowest percentage of correctly imputed genotypes in our case. MaCH-Minimac performed better than SHAPEIT-IMPUTE2. Pre-phasing always reduced imputation accuracy. IMPUTE2 and MaCH-Admix, both relying on admixed reference panels, showed comparable results. MaCH showed superior results if well-matched references were available (Nei's GST ≤ 0.010). For small to medium datasets, frameworks using the genetically closest reference panel are therefore recommended when the genetic distance between target and reference dataset is small; our results are valid for small to medium datasets. As shown on a larger dataset of population-based German samples, the disadvantage of pre-phasing decreases for larger sample sizes.
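The evaluation protocol described above, masking known genotypes, re-imputing them, and scoring how many are restored, can be sketched in a few lines. This is a hypothetical illustration, not the study's pipeline: the naive per-variant mode "imputer" stands in for real tools like MaCH or IMPUTE2, genotypes are coded 0/1/2 (alt-allele dosage), and all data are randomly generated.

```python
# Hypothetical sketch of masked-genotype accuracy evaluation:
# hide a fraction of true calls, fill them back in, score concordance.
import random

def mask_genotypes(truth, frac, rng):
    """Hide a random fraction of genotype calls (set to None)."""
    masked = [row[:] for row in truth]
    hidden = []
    for i, row in enumerate(masked):
        for j in range(len(row)):
            if rng.random() < frac:
                hidden.append((i, j))
                row[j] = None
    return masked, hidden

def impute_mode(masked):
    """Naive stand-in for an imputation tool: fill each variant
    (column) with its most common observed genotype."""
    n_var = len(masked[0])
    filled = [row[:] for row in masked]
    for j in range(n_var):
        observed = [row[j] for row in masked if row[j] is not None]
        mode = max(set(observed), key=observed.count) if observed else 0
        for row in filled:
            if row[j] is None:
                row[j] = mode
    return filled

def concordance(truth, imputed, hidden):
    """Fraction of masked genotypes restored correctly."""
    hits = sum(truth[i][j] == imputed[i][j] for i, j in hidden)
    return hits / len(hidden)

rng = random.Random(1)
truth = [[rng.choice([0, 0, 0, 1, 2]) for _ in range(50)] for _ in range(20)]
masked, hidden = mask_genotypes(truth, 0.1, rng)
score = concordance(truth, impute_mode(masked), hidden)
print(f"concordance on {len(hidden)} masked calls: {score:.2f}")
```

Real imputation tools exploit haplotype structure in a reference panel instead of a simple column mode, which is precisely why the choice of reference panel (admixed versus genetically closest) matters in the comparison above.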
APA, Harvard, Vancouver, ISO, and other styles
50

Nyman, Ellinor. "Cryptography : A study of modern cryptography and its mathematical methods." Thesis, Uppsala universitet, Analys och sannolikhetsteori, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-447460.

Full text
APA, Harvard, Vancouver, ISO, and other styles