To see the other types of publications on this topic, follow the link: Respiration – Measurement – Computer programs.

Dissertations / Theses on the topic 'Respiration – Measurement – Computer programs'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 31 dissertations / theses for your research on the topic 'Respiration – Measurement – Computer programs.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Haden, Lonnie A. "A numerical procedure for computing errors in the measurement of pulse time-of-arrival and pulse-width." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9849.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Metz, Gale Lynn. "The units of measure consistency checker for the entity-relationship-attribute requirements model." Thesis, Kansas State University, 1986. http://hdl.handle.net/2097/9941.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Shende, Sameer Suresh. "The role of instrumentation and mapping in performance measurement /." view abstract or download file of text, 2001. http://wwwlib.umi.com/cr/uoregon/fullcit?p3024533.

Full text
Abstract:
Thesis (Ph. D.)--University of Oregon, 2001.
Typescript. Includes vita and abstract. Includes bibliographical references (leaves 141-156). Also available for download via the World Wide Web; free to University of Oregon users.
APA, Harvard, Vancouver, ISO, and other styles
4

Curtis, Ronald Sanger. "Data structure complexity metrics." Buffalo, N.Y. : Dept. of Computer Science, State University of New York at Buffalo, 1994. http://www.cse.buffalo.edu/tech%2Dreports/94%2D39.ps.Z.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Goff, Roger Allen. "Complexity measurement of a graphical programming language and comparison of a graphical and a textual design language." Thesis, Virginia Tech, 1987. http://hdl.handle.net/10919/45686.

Full text
Abstract:

For many years the software engineering community has been attacking the software reliability problem on two fronts: first, via design methodologies, languages and tools as a pre-check on quality, and second, by measuring the quality of produced software as a post-check. This research attempts to unify the approach to creating reliable software by providing the ability to measure the quality of a design prior to its implementation. Also presented is a comparison of a graphical and a textual design language, in an effort to support cognitive science research findings that the human brain works more effectively with images than with text.


Master of Science
APA, Harvard, Vancouver, ISO, and other styles
6

Janc, Artur Adam. "Network Performance Evaluation within the Web Browser Sandbox." Digital WPI, 2009. https://digitalcommons.wpi.edu/etd-theses/112.

Full text
Abstract:
With the rising popularity of Web-based applications, the Web browser platform is becoming the dominant environment in which users interact with Internet content. We investigate methods of discovering information about network performance characteristics through the use of the Web browser, requiring only minimal user participation (navigating to a Web page). We focus on the analysis of explicit and implicit network operations performed by the browser (JavaScript XMLHTTPRequest and HTML DOM object loading) as well as by the Flash plug-in to evaluate network performance characteristics of a connecting client. We analyze the results of a performance study, focusing on the relative differences and similarities between download, upload and round-trip time results obtained in different browsers. We evaluate the accuracy of browser events indicating incoming data, comparing their timing to information obtained from the network layer. We also discuss alternative applications of the developed techniques, including measuring packet reception variability in a simulated streaming protocol. Our results confirm that browser-based measurements closely correspond to those obtained using standard tools in most scenarios. Our analysis of implicit communication mechanisms suggests that it is possible to make enhancements to existing “speedtest” services by allowing them to reliably determine download throughput and round-trip time to arbitrary Internet hosts. We conclude that browser-based measurement using techniques developed in this work can be an important component of network performance studies.
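The throughput arithmetic described above (bytes received divided by elapsed time, plus a small request to approximate round-trip time) can be illustrated outside the browser. The following Python sketch is only an analogue of the browser-based technique, not the thesis's JavaScript or Flash code, and the test URL is a placeholder assumption:

# Minimal sketch of active download-throughput and RTT estimation, analogous
# to (but not identical to) the browser-based technique described above.
# The test URL is a placeholder assumption; any HTTP-reachable object will do.
import time
import urllib.request

TEST_URL = "https://example.com/"  # hypothetical measurement target

def round_trip_time(url: str) -> float:
    """Approximate RTT with a small HEAD request (connection + headers only)."""
    start = time.perf_counter()
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        resp.read(0)
    return time.perf_counter() - start

def download_throughput(url: str) -> float:
    """Bytes per second for a full object download."""
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        payload = resp.read()
    elapsed = time.perf_counter() - start
    return len(payload) / elapsed

if __name__ == "__main__":
    print(f"RTT estimate:        {round_trip_time(TEST_URL) * 1000:.1f} ms")
    print(f"Download throughput: {download_throughput(TEST_URL) / 1024:.1f} KiB/s")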
APA, Harvard, Vancouver, ISO, and other styles
7

Bowers, Victoria A. "Concurrent versus retrospective verbal protocol for comparing window usability." Diss., This resource online, 1990. http://scholar.lib.vt.edu/theses/available/etd-09162005-115003/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Kobata, Robert Steven 1954. "AN AUTOMATED METHOD OF MEASURING ISOLATED MUSCLE CONTRACTION (VERAPAMIL, HALOTHANE, CALCIUM-CHLORIDE, MAGNESIUM SULFATE, GUINEA PIG)." Thesis, The University of Arizona, 1986. http://hdl.handle.net/10150/277003.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Pochobradsky, Pavel. "Computerized system for time-motion analysis." Thesis, McGill University, 1994. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=26306.

Full text
Abstract:
Regular participation in sports is a common practice among the general population. For cardiovascular fitness, the frequency, duration, intensity and mode of the activity must be appropriate for the individual to benefit from the activity. The benefits for cardiovascular fitness are questionable in sports involving high-intensity intermittent exercise of short duration. In the past, the procedures for determining the heart rate and the time-motion characteristics of an activity were cumbersome and time consuming, making application to sports inconvenient. The purpose of this project was to develop a computer-based system for matching heart rate data with time-motion characteristics. The system was tested using ice hockey and squash. Subjects were observed during activity. Heart rate data during the activity were collected using a Polar Vantage XL heart rate monitor set to record values at 5-second intervals. The duration and type of activity were entered into the computer in real time using a coding system. A program combined the time-motion analysis with the heart rates downloaded from the Polar heart rate monitor. The results were summarized as follows: (1) total time at each intensity level, (2) mean duration at each intensity level, and (3) mean heart rate at each intensity level. Output from the computer program was similar to that obtained by manual calculation.
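As a rough illustration of the matching step this abstract describes, the Python sketch below aligns heart-rate samples recorded at 5-second intervals with coded activity bouts and reports total time, mean bout duration and mean heart rate per intensity level; the data layout and codes are invented assumptions, not the original program:

# Sketch: combine 5-second heart-rate samples with coded activity bouts and
# summarise per intensity level, mirroring the outputs listed in the abstract.
# The data layout and intensity codes below are illustrative assumptions.
from collections import defaultdict
from statistics import mean

# Heart rate every 5 s, as a Polar-style download: (elapsed_seconds, bpm)
hr_samples = [(t, 120 + (t // 30) % 3 * 20) for t in range(0, 300, 5)]

# Coded activity bouts: (start_s, end_s, intensity_code)
bouts = [(0, 60, "low"), (60, 150, "high"), (150, 300, "moderate")]

totals = defaultdict(float)      # total seconds at each intensity
durations = defaultdict(list)    # bout durations per intensity
hr_by_level = defaultdict(list)  # heart-rate samples per intensity

for start, end, level in bouts:
    totals[level] += end - start
    durations[level].append(end - start)
    hr_by_level[level].extend(bpm for t, bpm in hr_samples if start <= t < end)

for level in totals:
    print(f"{level:8s}  total {totals[level]:5.0f} s  "
          f"mean bout {mean(durations[level]):5.1f} s  "
          f"mean HR {mean(hr_by_level[level]):5.1f} bpm")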
APA, Harvard, Vancouver, ISO, and other styles
10

Pragasam, Ravi L. "The MC68701 based spectrum analyzer." Thesis, Kansas State University, 1985. http://hdl.handle.net/2097/9872.

Full text
APA, Harvard, Vancouver, ISO, and other styles
11

Qi, Haiming. "Analysis and design of a contact pressure distribution measuring system." Thesis, McGill University, 1987. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=64066.

Full text
APA, Harvard, Vancouver, ISO, and other styles
12

Suoja, Nicole Marie. "Directional wavenumber characteristics of short sea waves." Thesis, Massachusetts Institute of Technology, 2000. http://hdl.handle.net/1721.1/88473.

Full text
Abstract:
Thesis (Ph. D.)--Joint Program in Applied Ocean Science and Engineering (Massachusetts Institute of Technology, Dept. of Ocean Engineering; and the Woods Hole Oceanographic Institution), 2000.
Includes bibliographical references (leaves 134-141).
by Nicole Marie Suoja.
Ph.D.
APA, Harvard, Vancouver, ISO, and other styles
13

Simpson, Charles Robert Jr. "A Distributed Approach to Passively Gathering End-to-End Network Performance Measurements." Thesis, Georgia Institute of Technology, 2004. http://hdl.handle.net/1853/5199.

Full text
Abstract:
NETI@home is an open-source software package that collects network performance statistics from end-systems. It has been written for and tested on the Windows, Solaris, and Linux operating systems, with testing for other operating systems to be completed soon. NETI@home is designed to run on end-user machines and collect various statistics about Internet performance. These statistics are then sent to a server at the Georgia Institute of Technology, where they are collected and made publicly available. This tool gives researchers much-needed data on the end-to-end performance of the Internet, as measured by end-users. NETI@home's basic approach is to sniff packets sent from and received by the host and infer performance metrics based on these observed packets. NETI@home users are able to select a privacy level that determines what types of data are gathered and what is not reported. NETI@home is designed to be an unobtrusive software system that runs quietly in the background with little or no intervention by the user, using few resources.
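One of the inferences mentioned above, round-trip time from passively observed packets, can be sketched as follows. The packet records are fabricated placeholders, and pairing an outgoing TCP SYN with the returning SYN-ACK is a simplification of what a sniffer-based tool might do, not NETI@home's actual code:

# Sketch: infer round-trip time from passively captured packet timestamps by
# pairing an outgoing SYN with the matching incoming SYN-ACK (a simplification).
# The packet records below are illustrative placeholders, not real captures.
from typing import Optional

packets = [
    # (timestamp_s, direction, flags, remote_host)
    (0.000, "out", "SYN",     "198.51.100.7"),
    (0.042, "in",  "SYN-ACK", "198.51.100.7"),
    (0.043, "out", "ACK",     "198.51.100.7"),
]

def handshake_rtt(pkts, host) -> Optional[float]:
    """Return seconds between the first outgoing SYN and the first incoming SYN-ACK."""
    syn_time = None
    for ts, direction, flags, remote in pkts:
        if remote != host:
            continue
        if direction == "out" and flags == "SYN" and syn_time is None:
            syn_time = ts
        elif direction == "in" and flags == "SYN-ACK" and syn_time is not None:
            return ts - syn_time
    return None

print(f"Estimated RTT to 198.51.100.7: {handshake_rtt(packets, '198.51.100.7') * 1000:.1f} ms")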
APA, Harvard, Vancouver, ISO, and other styles
14

O'Mara, David Thomas John. "Automated facial metrology." University of Western Australia. School of Computer Science and Software Engineering, 2002. http://theses.library.uwa.edu.au/adt-WU2003.0015.

Full text
Abstract:
Automated facial metrology is the science of objective and automatic measurement of the human face. There are many reasons for measuring the human face. Psychologists are interested in determining how humans perceive beauty, and how this is related to facial symmetry [158]. Biologists are interested in the relationship between symmetry and biological fitness [124]. Anthropologists, surgeons, forensic experts, and security professionals can also benefit from automated facial metrology [32, 101, 114]. This thesis investigates the concept of automated facial metrology, presenting original techniques for segmenting 3D range and colour images of the human head, measuring the bilateral symmetry of n-dimensional point data (with particular emphasis on measuring the human head), and extracting the 2D profile of the face from 3D data representing the head. Two facial profile analysis techniques are also presented that are incremental improvements over existing techniques. Extensive literature reviews of skin colour modelling, symmetry detection, symmetry measurement, and facial profile analysis are also included in this thesis. It was discovered during this research that bilateral symmetry detection using principal axes is not appropriate for detecting the mid-line of the human face. An original mid-line detection technique that does not use symmetry, and is superior to the symmetry-based technique, was developed as a direct result of this discovery. There is disagreement among researchers about the effect of ethnicity on skin colour. Some researchers claim that people from different ethnic groups have the same skin chromaticity (hue, saturation) [87, 129, 206], while other researchers claim that different ethnic groups have different skin colours [208, 209]. It is shown in this thesis that people from apparently different ethnic groups can have skin chromaticity that is within the same Gaussian distribution. The chromaticity-based skin colour model used in this thesis has been chosen from the many models previously used by other researchers, and its applicability to skin colour modelling has been justified. It is proven in this thesis that the Mahalanobis distance to the skin colour distribution is Gaussian in both the chromatic and normalised rg colour spaces. Most facial profile analysis techniques use either tangency or curvature to locate anthropometric features along the profile. Techniques based on both approaches have been implemented and compared. Neither approach is clearly superior to the other, but the results indicate that a hybrid technique, combining both approaches, could provide significant improvements. The areas of research most relevant to facial metrology are reviewed in this thesis and original contributions are made to the body of knowledge in each area. The techniques, results, literature reviews, and suggestions presented in this thesis provide a solid foundation for further research and hopefully bring the goal of automated facial metrology a little closer to being achieved.
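The chromaticity-based skin model and Mahalanobis distance mentioned above can be illustrated with a short Python sketch: convert RGB values to normalised rg chromaticity, fit a Gaussian to labelled skin samples, and score new pixels by their Mahalanobis distance. The sample values are invented placeholders; this is not the thesis's implementation:

# Sketch: Mahalanobis distance of a pixel's normalised rg chromaticity to a
# Gaussian skin-colour model, as described in the abstract. Sample pixels are
# invented placeholders; this is not the thesis's implementation.
import numpy as np

def rg_chromaticity(rgb: np.ndarray) -> np.ndarray:
    """Map RGB rows to (r, g) = (R, G) / (R + G + B)."""
    s = rgb.sum(axis=1, keepdims=True)
    return rgb[:, :2] / np.where(s == 0, 1, s)

# Fit the skin model from labelled skin pixels (placeholder values).
skin_rgb = np.array([[210, 150, 120], [190, 140, 115], [225, 170, 140],
                     [180, 125, 100], [200, 145, 118]], dtype=float)
skin_rg = rg_chromaticity(skin_rgb)
mu = skin_rg.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(skin_rg, rowvar=False))

def mahalanobis(rg: np.ndarray) -> float:
    d = rg - mu
    return float(np.sqrt(d @ cov_inv @ d))

# Score two candidate pixels: one near the skin samples, one strongly green.
for pixel in (np.array([205.0, 148.0, 119.0]), np.array([40.0, 200.0, 60.0])):
    rg = rg_chromaticity(pixel[None, :])[0]
    print(f"pixel {pixel.astype(int)} -> Mahalanobis distance {mahalanobis(rg):.2f}")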
APA, Harvard, Vancouver, ISO, and other styles
15

Woods, Juliette Aimi. "Numerical accuracy of variable-density groundwater flow and solute transport simulations." Title page, contents and abstract only, 2004. http://web4.library.adelaide.edu.au/theses/09PH/09phw8941.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
16

Kim, Kye Hyun 1956. "Classification of environmental hydrologic behaviors in Northeastern United States." Thesis, The University of Arizona, 1989. http://hdl.handle.net/10150/277083.

Full text
Abstract:
Environmental response to acidic deposition occurs through the vehicle of water movement in the ecosystem. As part of the environmental studies of acidic deposition in the ecosystem, an output-based hydrologic classification was derived from basin hydrologies based on the distribution of baseflow, snowmelt, and direct runoff sources. Because of the differences in flow paths and exposure duration, these components were assumed to represent distinct geochemical responses. As a first step, user-friendly software was developed to calculate the baseflow based on the separation of annual hydrographs. It also generates the hydrograph for visual analysis using a trial separation slope. After the software was completed, about 1200 streamflow gauging stations in the Northeastern U.S. were accessed for flow separation and other hydrologic characteristics. At the final stage, based on the output from the streamflow analysis, cluster analysis was performed to classify streamflow behaviors in terms of acidic inflow. The output from the cluster analysis shows more efficient regional boundaries for the subregions, based on regional baseflow properties, than the current regional boundaries used by the U.S. Environmental Protection Agency (U.S.E.P.A.) for environmental management of acidic deposition.
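For readers unfamiliar with flow separation, the Python sketch below applies the one-parameter Lyne-Hollick recursive digital filter, an automated alternative to the interactive trial-slope hydrograph separation described above; the daily flows are placeholder values:

# Sketch: baseflow separation with the one-parameter Lyne-Hollick recursive
# digital filter (an automated alternative to the interactive trial-slope
# separation described in the abstract). Daily flows below are placeholders.
def lyne_hollick_baseflow(flow, alpha=0.925):
    """Return the baseflow series; alpha is the filter parameter (~0.9-0.95)."""
    quick = [0.0]                                  # quickflow component, first value assumed 0
    for i in range(1, len(flow)):
        qf = alpha * quick[-1] + (1 + alpha) / 2 * (flow[i] - flow[i - 1])
        quick.append(min(max(qf, 0.0), flow[i]))   # keep 0 <= quickflow <= total flow
    return [q - qf for q, qf in zip(flow, quick)]

daily_flow = [5, 5, 6, 30, 55, 40, 25, 15, 10, 8, 7, 6, 6]   # m^3/s, placeholder
base = lyne_hollick_baseflow(daily_flow)
bfi = sum(base) / sum(daily_flow)                  # baseflow index: fraction of total flow
for q, b in zip(daily_flow, base):
    print(f"total {q:5.1f}  baseflow {b:5.1f}")
print(f"Baseflow index: {bfi:.2f}")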
APA, Harvard, Vancouver, ISO, and other styles
17

Suaide, André Luis Alarcon do Passo. "Desenvolvimento e validação de uma ferramenta computacional para mensuração das curvaturas da coluna vertebral." Universidade de São Paulo, 2008. http://www.teses.usp.br/teses/disponiveis/47/47135/tde-03042009-090940/.

Full text
Abstract:
The spine plays an important role in everyday life, providing the support needed for trunk movement and human locomotion, and its function is directly related to quality of life; the arrangement and articulation of the vertebrae give the spine the stability and flexibility required for balance. Thoracic hyperkyphosis, lumbar hyperlordosis and scoliosis are the most common postural pathologies and can be diagnosed by measuring the angle of spinal curvature. Radiography is commonly used for such measurements, but besides being expensive it is highly invasive because of the exposure to radiation, so it is not advisable to repeat it frequently, which makes follow-up of treatment difficult. Several noninvasive methods exist, but they do not combine practicality, low cost and three-dimensional analysis of the curvature, being effective only for measuring the spine in the sagittal plane. For these reasons, for clinical use the health professional needs an inexpensive, reliable, practical and noninvasive method for measuring spinal curvature. The goal of this work was to develop and validate, against a motion-capture (Cinemetria) system composed of five infrared cameras, a computational tool (LoB Analytics) for measuring these angles; the software will be open source and free to use. The mean of all angles obtained with Cinemetria was 43.4±18.5° and with LoB Analytics 43.9±17.7°, with a very strong correlation of 0.98. Linear regressions confirmed that angles computed by LoB Analytics are as reliable as those computed by Cinemetria, a method widely used today. The main advantages of LoB Analytics over Cinemetria are its low cost and the practicality of using it in laboratories and clinics.
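One way such a curvature angle can be computed from digitised marker coordinates is as a Cobb-style angle between straight lines fitted to the two ends of the spinal curve, sketched below in Python with invented marker positions; this is not the LoB Analytics algorithm itself:

# Sketch: a Cobb-style curvature angle computed as the angle between straight
# lines fitted to the upper and lower ends of a digitised spinal curve.
# Marker coordinates are invented; this is not the LoB Analytics algorithm.
import math
import numpy as np

# (x, y) positions of markers along the spine, top to bottom (placeholder data).
markers = np.array([[0.0, 0], [1.2, 5], [2.5, 10], [3.0, 15],
                    [2.6, 20], [1.4, 25], [0.2, 30]])

def end_tangent(points: np.ndarray) -> np.ndarray:
    """Unit direction of a least-squares line through a few end markers."""
    # Fit x as a function of y, since the spine runs roughly along y.
    slope, _ = np.polyfit(points[:, 1], points[:, 0], 1)
    direction = np.array([slope, 1.0])
    return direction / np.linalg.norm(direction)

upper = end_tangent(markers[:3])       # tangent at the upper end of the curve
lower = end_tangent(markers[-3:])      # tangent at the lower end of the curve
angle = math.degrees(math.acos(float(np.clip(np.dot(upper, lower), -1.0, 1.0))))
print(f"Curvature angle: {angle:.1f} degrees")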
APA, Harvard, Vancouver, ISO, and other styles
18

Batista, Gabriela de Fatima. "Programa de medição para organizações de alta maturidade." [s.n.], 2005. http://repositorio.unicamp.br/jspui/handle/REPOSIP/260163.

Full text
Abstract:
Advisor: Mario Jino
Dissertation (Master's) - Universidade Estadual de Campinas, Faculdade de Engenharia Elétrica e de Computação
Organizations at a high level of maturity have as their main goal the continuous improvement of their processes. Such organizations systematically apply metrics, measuring process performance and analyzing these measurements to make decisions; in other words, they effectively manage by data. Quantitative assessment of the performance of the project's defined software process and its variations allows better planning and management of projects. Considering the need to measure, predict and adjust the software process to reach quality goals, a measurement program is proposed to support quantitative management. The measurement program presents metrics aligned with the organizational goals and requires that, after data collection and analysis, the metrics stakeholders - a senior manager, a functional manager, a project leader or a developer - commit to using the results of the analysis to identify process deviations and apply the necessary corrective actions; in this way, the performance of the project's defined software process can be controlled within acceptable limits. To support deployment of the measurement process and the application of metrics, a tool to collect, validate and analyze data, based on statistical process control and named Vigia, was developed. Vigia can be used to control the performance of the project's defined software process, assuring, through corrective actions in near-real time and consequent adjustments to the software process, that the process compromises neither the organizational quality goals nor the business goals. A case study was carried out at Motorola to evaluate both the measurement program and the Vigia tool.
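As a loose illustration of the statistical process control idea behind a tool like Vigia (not its actual implementation), the Python sketch below builds an individuals control chart for a process metric and flags out-of-control observations; the measurements are placeholder values:

# Sketch: an individuals (X) control chart in the spirit of statistical process
# control, flagging observations outside mean +/- 2.66 * mean moving range.
# The defect-density measurements below are placeholder values, not Vigia data.
from statistics import mean

defect_density = [2.1, 1.8, 2.4, 2.0, 1.9, 2.2, 6.0, 2.1, 1.7, 2.3]  # defects/KLOC

center = mean(defect_density)
moving_ranges = [abs(b - a) for a, b in zip(defect_density, defect_density[1:])]
mr_bar = mean(moving_ranges)
ucl = center + 2.66 * mr_bar          # upper control limit (individuals chart)
lcl = max(center - 2.66 * mr_bar, 0)  # lower control limit, floored at zero

print(f"center line {center:.2f}, limits [{lcl:.2f}, {ucl:.2f}]")
for i, x in enumerate(defect_density, start=1):
    if not lcl <= x <= ucl:
        print(f"observation {i} = {x:.1f} is out of control -> investigate the process")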
Master's
Computer Engineering
Master in Electrical Engineering
APA, Harvard, Vancouver, ISO, and other styles
19

Pearcy, Charles M. "The impact of background resolution on Target Acquisitions Weapons Software (TAWS) sensor performance." Thesis, Monterey, California. Naval Postgraduate School, 2005. http://hdl.handle.net/10945/2232.

Full text
Abstract:
Approved for public release, distribution is unlimited
This study evaluated the sensitivity of TAWS detection range calculations to the spatial resolution of scenario backgrounds. Sixteen independent sites were analyzed to determine TAWS background. Multispectral satellite data were processed to different spatial resolutions from 1 m to 8 km. The resultant imagery was further processed to determine TAWS background type. The TAWS background type was refined to include soil moisture characteristics. Soil moisture analyses were obtained using in situ measurements, the Air Force's Agricultural-Meteorological (AGRMET) model and the Army's Fast All-seasons Soil Strength (FASST) model. The analyzed imagery was compared to the current default 1° latitude by 1° longitude database in TAWS. The use of the current default TAWS background database was shown to result in TAWS ranges differing from the 1 m standard range by 18-23%. The uncertainty was reduced to 5% when background resolution was improved to 8 km in rural areas. By contrast, in urban regions the uncertainty was reduced to 14% when spatial resolution was reduced to 30 m. These results suggest that the rural and urban designations are important to the definition of a background database.
First Lieutenant, United States Air Force
APA, Harvard, Vancouver, ISO, and other styles
20

Masters, Michael Harry. "A computer-based respiratory measurement system and a temperature transducer for monitoring respiratory flow temperature." 1985. http://hdl.handle.net/2097/27494.

Full text
APA, Harvard, Vancouver, ISO, and other styles
21

Schmalzried, Terry Eugene. "Classification of wheat kernels by machine-vision measurement." 1985. http://hdl.handle.net/2097/27530.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

Hinga, Mark Brandon. "Using parallel computation to apply the singular value decomposition (SVD) in solving for large Earth gravity fields based on satellite data." Thesis, 2004. http://hdl.handle.net/2152/1190.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Levine, Jacob Harrison. "Learning cell states from high-dimensional single-cell data." Thesis, 2016. https://doi.org/10.7916/D88052DM.

Full text
Abstract:
Recent developments in single-cell measurement technologies have yielded dramatic increases in throughput (measured cells per experiment) and dimensionality (measured features per cell). In particular, the introduction of mass cytometry has made possible the simultaneous quantification of dozens of protein species in millions of individual cells in a single experiment. The raw data produced by such high-dimensional single-cell measurements provide unprecedented potential to reveal the phenotypic heterogeneity of cellular systems. In order to realize this potential, novel computational techniques are required to extract knowledge from these complex data. Analysis of single-cell data is a new challenge for computational biology, as early development in the field was tailored to technologies that sacrifice single-cell resolution, such as DNA microarrays. The challenges for single-cell data are quite distinct and require multidimensional modeling of complex population structure. Particular challenges include nonlinear relationships between measured features and non-convex subpopulations. This thesis integrates methods from computational geometry and network analysis to develop a framework for identifying the population structure in high-dimensional single-cell data. At the center of this framework is PhenoGraph, an algorithmic approach to defining subpopulations, which when applied to healthy bone marrow data was shown to reconstruct known immune cell types automatically without prior information. PhenoGraph demonstrated superior accuracy, robustness, and efficiency, compared to other methods. The data-driven approach becomes truly powerful when applied to less characterized systems, such as malignancies, in which the tissue diverges from its healthy population composition. Applying PhenoGraph to bone marrow samples from a cohort of acute myeloid leukemia (AML) patients, the thesis presents several insights into the pathophysiology of AML, which were extracted by virtue of the computational isolation of leukemic subpopulations. For example, it is shown that leukemic subpopulations diverge from healthy bone marrow but not without bound: Leukemic cells are apparently free to explore only a restricted phenotypic space that mimics normal myeloid development. Further, the phenotypic composition of a sample is associated with its cytogenetics, demonstrating a genetic influence on the population structure of leukemic bone marrow. The thesis goes on to show that functional heterogeneity of leukemic samples can be computationally inferred from molecular perturbation data. Using a variety of methods that build on PhenoGraph's foundations, the thesis presents a characterization of leukemic subpopulations based on an inferred stem-like signaling pattern. Through this analysis, it is shown that surface phenotypes often fail to reflect the true underlying functional state of the subpopulation, and that this functional stem-like state is in fact a powerful predictor of survival in large, independent cohorts. Altogether, the thesis takes the existence and importance of cellular heterogeneity as its starting point and presents a mathematical framework and computational toolkit for analyzing samples from this perspective. It is shown that phenotypic and functional heterogeneity are robust characteristics of acute myeloid leukemia with clinically significant ramifications.
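The graph-based clustering idea behind PhenoGraph (build a k-nearest-neighbour graph over cells, then find communities) can be sketched crudely in Python as below. This stand-in uses brute-force neighbours and simple label propagation rather than PhenoGraph's Jaccard-weighted graph and Louvain optimisation, and the "cells" are synthetic placeholders:

# Crude sketch of graph-based clustering in the spirit of PhenoGraph:
# build a k-nearest-neighbour graph over cells, then run label propagation.
# PhenoGraph itself uses a Jaccard-weighted graph and Louvain community
# detection; the synthetic "cells" below are placeholders.
import numpy as np

rng = np.random.default_rng(0)
# Two synthetic populations in a 5-marker space (placeholder single-cell data).
cells = np.vstack([rng.normal(0.0, 0.3, size=(50, 5)),
                   rng.normal(2.0, 0.3, size=(50, 5))])

k = 10
dists = np.linalg.norm(cells[:, None, :] - cells[None, :, :], axis=2)
np.fill_diagonal(dists, np.inf)
neighbours = np.argsort(dists, axis=1)[:, :k]      # k nearest neighbours per cell

labels = np.arange(len(cells))                     # start: every cell is its own label
for _ in range(20):                                # iterate label propagation
    for i in range(len(cells)):
        neighbour_labels = labels[neighbours[i]]
        values, counts = np.unique(neighbour_labels, return_counts=True)
        labels[i] = values[np.argmax(counts)]      # adopt the majority label

communities = {lab: np.flatnonzero(labels == lab) for lab in np.unique(labels)}
print(f"found {len(communities)} communities with sizes "
      f"{[len(v) for v in communities.values()]}")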
APA, Harvard, Vancouver, ISO, and other styles
24

Hu, Zili. "Development of PHP to UPML transformer." 2013. http://liblink.bsu.edu/uhtbin/catkey/1719998.

Full text
Abstract:
This thesis developed a new markup language based on the eXtensible Markup Language (XML), named the Unified Programming Markup Language (UPML), which represents an abstraction of the programming techniques of popular programming languages and is used to store the programming semantic information of various programming languages. UPML aims to provide a general software quality analysis platform and a gateway for translating programs between high-level programming languages. This research created and analyzed the features of UPML and concluded that UPML may have advantages over traditional and newly emerged methods in software quality analysis and programming language translation. As a proof of concept in building such a software analysis and translation system, this research developed a PHP to UPML transformer. Execution examples showed that it works correctly for the core areas of popular programming techniques, structured programming (SP) and object-oriented programming (OOP). The PHP implementation can be easily adapted to other programming languages that support the same programming techniques. Since UPML is extensible, languages of other programming paradigms beyond SP and OOP can be easily added.
Related work -- Analysis of programming languages and programming techniques -- Implementation of UPML -- Implementation of a PHP to UPML transformer -- Examples of execution.
Department of Computer Science
APA, Harvard, Vancouver, ISO, and other styles
25

Ghebreab, Tesfalidet Alem. "Modelling the soil water balance and applications using a decision support system (DSSAT v3.5)." 2003. http://hdl.handle.net/10413/3579.

Full text
Abstract:
Water is a scarce resource used by various stakeholders. Agriculture is one of the users of this resource, especially for growing plants. Plants need to take up carbon dioxide to prepare their own food. For this purpose plants have stomatal openings. These same openings are used for transpiration. Quantifying transpiration is important for efficient water resource management and crop production because it is closely related to dry matter production. Transpiration could be measured using a number of methods or calculated indirectly through quantification of the soil water balance components using environmental instruments. The use of models such as the Decision Support System for Agrotechnology Transfer (DSSAT v3.5) is, however, much easier than the use of environmental instruments. Nowadays, with the increased capabilities of computers, the use of crop simulation modelling has become common practice for various applications. But it is important that models such as DSSAT v3.5 be calibrated and verified before being used for applications such as long-term risk assessment and evaluation of cultural practices. In this study the model inputs were collected first. Then the model was calibrated and verified. Next, sensitivity analysis was carried out to observe the model behavior in response to changes in inputs. Finally, the model was applied to long-term risk assessment and evaluation of cultural practices. In this study, the data collected formed the basis for the minimum dataset needed for running the DSSAT v3.5 model. In addition, the factory-given transmission of shading material over a tomato crop was compared to actual measurements. Missing weather data (solar irradiance, minimum and maximum air temperature and rainfall) were completed after checking that they were homogeneous with measurements from a nearby automatic weather station. It was found that the factory-given transmission value of 0.7 for the shade cloth differed from the actual value of 0.765, so the latter was used for conversion of solar irradiance measured outside the shade cloth to solar irradiance inside the shade cloth. Conventional laboratory procedures were used for the analysis of soil physical and chemical properties. Soil water content limits were determined using regression equations based on texture and bulk density. Other model inputs were calculated using the DSSAT model. Crop management inputs were also documented for creation of the experimental details file. The DSSAT v3.5 soil water balance model was calibrated for soil, plant and weather conditions at Ukulinga by modifying some of its inputs, and simulations of the soil water balance components were then evaluated against actual measurements. For this purpose half of the available data was used for calibration and the other half for verification. Model simulations of soil water content (150 to 300 mm and 450 to 600 mm) improved significantly after calibration. In addition, simulations of leaf area index (LAI) were satisfactory. Simulated evapotranspiration (ET) deviated somewhat from the measured ET because the latter was calculated by multiplying the potential ET by a constant crop multiplier, the so-called crop coefficient. Sensitivity analysis and long-term risk assessments for yield, runoff, drainage and other model outputs were carried out for soil, plant and weather conditions at Ukulinga. For this purpose, some of the input parameters were varied individually to determine the effect on seven model output parameters.
In addition, long-term weather data were used to simulate yield, biomass at harvest, runoff and drainage for various initial soil water content values. The sensitivity analysis gave results that conform to the current understanding of the soil-plant-atmosphere system. The long-term assessment showed that it is risky to grow tomatoes during the winter season at Ukulinga irrespective of the initial soil water content unless certain measures are taken, such as the use of mulching to protect the plants from frost. The CROPGRO-Soybean model was used to evaluate the soil water balance and growth routines for soil, plant and weather conditions at Cedara. In addition, cultural practices such as row spacing, seeding rate and cultivars were also evaluated using long-term weather data. Simulations of soil water content were unsatisfactory even after calibration of some of the model parameters. Other model parameters such as LAI, yield and flowering date showed satisfactory agreement with observed values. Results from this study suggest that the model is sensitive to weather and to cultural practices such as seeding rates, row spacing and cultivar maturity groups. The general use of decision support systems is limited by various factors. Some of these factors are: unclear definition of clients/end users; no end-user input prior to or during the development of the DSS; the DSS does not solve the problems that the client is experiencing; the DSS does not match their decision-making style; producers see no reason to change current management practices; the DSS provides no benefit over the current decision-making system; limited computer ownership amongst producers; lack of field testing; producers do not trust the output due to a lack of understanding of the underlying theories of the models utilized; they cannot access the necessary data inputs; lack of technical support; lack of training in the development of DSS software; marketing and support constraints; institutional resistance; the short shelf-life of DSS software; and technical, user and other constraints. For successful use of DSS, the above-mentioned constraints have to be resolved before their useful impacts on farming systems can be realized. This study has shown that the DSSAT v3.5 model simulations of soil water balance components such as evapotranspiration and soil water content were unsatisfactory, while simulations of plant parameters such as leaf area index, yield and phenological stages were simulated to a satisfactory standard. Sensitivity analysis gave results that conform to the current understanding of the soil-plant-atmosphere system. Model outputs such as yield and phenological stages were found to be sensitive to weather and to cultural practices such as seeding rates, row spacing and cultivar maturity groups. It has further been shown that the model could be used for risk assessment of various crop management practices and for the evaluation of cultural practices. However, before farmers can use DSSAT v3.5, several constraints have to be resolved.
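For orientation, the Python sketch below shows the kind of single-layer daily "bucket" soil water balance that underlies such simulations; it is a drastic simplification of DSSAT v3.5, and all parameter and weather values are invented:

# Sketch: a single-layer daily "bucket" soil water balance, a drastic
# simplification of what DSSAT v3.5 simulates. Parameters and weather values
# below are invented placeholders.
FIELD_CAPACITY = 120.0   # mm of plant-available storage (placeholder)

def daily_balance(storage, rain, potential_et):
    """Update soil water storage for one day; return (storage, aet, drainage)."""
    storage += rain
    drainage = max(storage - FIELD_CAPACITY, 0.0)   # excess water drains below the root zone
    storage -= drainage
    aet = min(potential_et * storage / FIELD_CAPACITY, storage)  # ET limited by supply
    storage -= aet
    return storage, aet, drainage

storage = 80.0                                   # initial soil water (mm)
weather = [(0, 5), (12, 4), (30, 3), (0, 5), (0, 6), (2, 5)]  # (rain, PET) per day, mm

for day, (rain, pet) in enumerate(weather, start=1):
    storage, aet, drain = daily_balance(storage, rain, pet)
    print(f"day {day}: storage {storage:6.1f} mm  ET {aet:4.1f} mm  drainage {drain:4.1f} mm")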
Thesis (M.Sc.)-University of Natal, Pietermaritzburg, 2003.
APA, Harvard, Vancouver, ISO, and other styles
26

Durand, Lenard. "An explorative study on the development of a framework for the measurement of performance and trust." Thesis, 2017. http://hdl.handle.net/10500/23272.

Full text
Abstract:
Based on literature, a theoretical model was developed for viable performance consisting of eight constructs whilst the trust model of Martins (2000) was used to measure four organisational trust constructs. Exploratory factor analysis was used to extract the constructs, and structural equation modelling was employed to validate the models against the data. An empirical model for viable performance resulted in a solution with seven constructs and organisational trust with five constructs. The two empirical models were unified into a model of viable performance and trust resulting in a measurement model where all 12 constructs were shown. Significant levels of internal consistency were measured. The resulting measurement model was tested for group differences, and no significant differences were found, indicating that the assessment can be used across different groups. It was concluded that the aim to construct and test an integrated and comprehensive theoretical framework of viable performance and trust was achieved and the resulting Viable Performances and Trust Indicator (VPTI) was validated as an assessment to be used across groups. Organisations can thus use the framework and VPTI assessment tool with confidence to assess performance and trust across different biographical groups. Future researchers can build on this exploratory study to refine the scales and apply the measurement model within the wider context of South Africa or as a globally accepted model.
Industrial and Organisational Psychology
D. Litt. et Phil. (Industrial and Organisational Psychology)
APA, Harvard, Vancouver, ISO, and other styles
27

Woods, Juliette. "Numerical Accuracy of Variable-Density Groundwater Flow and Solute Transport Simulations." 2004. http://hdl.handle.net/2440/37924.

Full text
Abstract:
The movement of a fluid and solute through a porous medium is of great practical interest because this describes the spread of contaminants through an aquifer. Many contaminants occur at concentrations sufficient to alter the density of the fluid, in which case the physics is typically modelled mathematically by a pair of coupled, nonlinear partial differential equations. There is disagreement as to the exact form of these governing equations. Codes aiming to solve some version of the governing equations are typically tested against the Henry and Elder benchmark problems. Neither benchmark has an analytic solution, so in practice they are treated as exercises in inter-code comparison. Different code developers define the boundary conditions of the Henry problem differently, and the Elder problem's results are poorly understood. The Henry, Elder and some other problems are simulated on several different codes, which produce widely varying results. The existing benchmarks are unable to distinguish which code, if any, simulates the problems correctly, illustrating the benchmarks' limitations. To determine whether these discrepancies might be due to numerical error, one popular code, SUTRA, is considered in detail. A numerical analysis of a special case reveals that SUTRA is numerically dispersive. This is confirmed using the Gauss pulse test, a benchmark that does have an analytic solution. To further explain inter-code discrepancies, a test code is developed which allows a choice of numerical methods. Some of the methods are based on SUTRA's while others are finite difference methods of varying levels of accuracy. Simulations of the Elder problem reveal that the benchmark is extremely sensitive to the choice of solution method: qualitative differences are seen in the flow patterns. Finally, the impact of numerical error on a real-world application, the simulation of saline disposals, is considered. Saline disposal basins are used to store saline water away from rivers and agricultural land in parts of Australia. Existing models of disposal basins are assessed in terms of their resemblance to real field-site conditions, and in terms of numerical error. This leads to the development of a new model which aims to combine verisimilitude with numerical accuracy.
Thesis (Ph.D.)--School of Mathematical Sciences (Applied Mathematics), 2004.
APA, Harvard, Vancouver, ISO, and other styles
28

Haas, Derek Anderson 1981. "Production of [beta-gamma] coincidence spectra of individual radioxenon isotopes for improved analysis of nuclear explosion monitoring data." 2008. http://hdl.handle.net/2152/18097.

Full text
Abstract:
Radioactive xenon gas is a fission product released in the detonation of nuclear devices that can be detected in atmospheric samples far from the detonation site. In order to improve the capabilities of radioxenon detection systems, this work produces β-γ coincidence spectra of individual isotopes of radioxenon. Previous methods of radioxenon production consisted of the removal of mixed-isotope samples of radioxenon gas released from fission of contained fissile materials such as ²³⁵U. In order to produce individual samples of the gas, isotopically enriched stable xenon gas is irradiated with neutrons. The detection of the individual isotopes is also modeled using Monte Carlo simulations to produce spectra. The experiment shows that samples of ¹³¹ᵐXe, ¹³³Xe, and ¹³⁵Xe with a purity greater than 99% can be produced, and that a sample of ¹³³ᵐXe can be produced with a relatively low amount of ¹³³Xe background. These spectra are compared to models and used as essential library data for the Spectral Deconvolution Analysis Tool (SDAT) to analyze atmospheric samples of radioxenon for evidence of nuclear events.
APA, Harvard, Vancouver, ISO, and other styles
29

Pohl, Martha Jacoba. "A total quality management (TQM) strategic measurement perspective with specific reference to the software industry." Thesis, 1996. http://hdl.handle.net/10500/15796.

Full text
Abstract:
The dissertation aims to obtain an integrated and comprehensive perspective on measurement issues that play a strategic role in organisations that aim at continuous quality improvement through TQM. The multidimensional definition of quality is proposed to view quality holistically. The definition is dynamic, thus dimensions are subject to evolution. Measurement of the quality dimensions is investigated. The relationship between quality and cost, productivity and profitability respectively is examined. The product quality dimensions are redefined for processes. Measurement is a strategic component of TQM. Integration of financial measures with supplier, customer, performance and internal process measurement is essential for synergism. Measurement of quality management is an additional strategic quality dimension. Applicable research was integrated. Quantitative structures used successfully in industry to achieve quality improvement are important, thus the quality management maturity grid, cleanroom software engineering, software factories, quality function deployment, benchmarking and the ISO 9000 standards are briefly described. Software metrics programs are considered to be an application of a holistic measurement approach to quality. Two practical approaches are identified. A framework for initiating implementation is proposed. Two strategic software measurement issues are reliability and cost estimation. Software reliability measurement and modelling are introduced. A strategic approach to software cost estimation is suggested. The critical role of data collection is emphasized. Different approaches to implementing software cost estimation in organisations are proposed. A total installed cost template is envisaged as the ultimate goal. An overview of selected software cost estimation models is provided. Potential research areas are identified. The linearity or nonlinearity of the software production function is analysed. The synergy between software cost estimation models and project management techniques is investigated. The quantification aspects of uncertainty in activity durations, pertaining to project scheduling, are discussed. Statistical distributions for activity durations are reviewed and compared. A structural view of criteria determining activity duration distribution selection is provided. Estimation issues are reviewed. The integration of knowledge from dispersed fields leads to new dimensions of interaction. Research and practical experience regarding software metrics and software metrics programs can be successfully applied to address the measurement of strategic indicators in other industries.
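As a small worked example of the kind of software cost estimation model surveyed in the dissertation, the Python sketch below evaluates the Basic COCOMO effort and schedule equations for the three classic project modes; the coefficients are Boehm's textbook values, not results from this work:

# Worked example: Basic COCOMO effort and schedule estimates for a given
# project size. Coefficients are Boehm's textbook values for the three
# project modes; they are illustrative, not results from the dissertation.
COCOMO_MODES = {
    # mode: (a, b, c, d)  ->  effort = a * KLOC**b person-months,
    #                         schedule = c * effort**d calendar months
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc: float, mode: str):
    a, b, c, d = COCOMO_MODES[mode]
    effort = a * kloc ** b          # person-months
    schedule = c * effort ** d      # calendar months
    return effort, schedule

size_kloc = 32.0                    # thousands of delivered source lines (assumed)
for mode in COCOMO_MODES:
    effort, schedule = basic_cocomo(size_kloc, mode)
    print(f"{mode:13s}: {effort:6.1f} person-months over {schedule:4.1f} months")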
Business Management
D. Phil. (Operations Research)
APA, Harvard, Vancouver, ISO, and other styles
30

(5930126), Daniel J. Pederson. "Determining, Treating, and Preventing Mechanisms of Sudden Death in Epilepsy using Medical Implantable Devices." Thesis, 2019.

Find full text
Abstract:

People with epilepsy have an increased risk of mortality when compared to the general population. These increased mortality risks include deaths related to status epilepticus and sudden unexpected death in epilepsy (SUDEP). Physiological data describing cardiac, respiratory, and brain function prior to sudden death in epilepsy are crucial to studying the underlying mechanisms behind these deaths. Because it is unknown when sudden deaths in epilepsy may occur, continuous monitoring is necessary to guarantee the capture of physiological data prior to death.

I have used custom designed implantable devices to continuously measure cardiac, respiratory, and neurological signals in freely behaving rats with chronically induced epilepsy. Due to the continuous respiration measurements, the resultant dataset is the first of its kind. This dataset indicates that respiratory abnormalities (reduced respiration and short apneas) occur during and after seizures. These abnormalities may indicate SUDEP onset because obstructive apneas due to laryngospasm have been indicated as possible causes of SUDEP in other studies.

Laryngospasms can be caused by gastric acid coming into contact with the larynx. During a laryngospasm, intrinsic laryngeal muscles contract, resulting in the closure of the airway. Recently published research has indicated that acid reflux may be responsible for triggering fatal laryngospasms in rats with induced seizures. I have found that the larynx can be opened during a laryngospasm by electrically stimulating the recurrent laryngeal nerves. I have also found that performing gastric vagotomies leads to a statistically significant reduction in mortality due to fatal apneas in rats with induced seizures.
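A minimal Python sketch of how short apneas might be flagged in a continuously recorded respiration trace, using simple peak-to-peak breath timing on synthetic data, is shown below; it is illustrative only and is not the implanted-device firmware or the study's analysis code:

# Sketch: flag apneas in a respiration trace by detecting breath peaks and
# looking for long gaps between them. The signal is synthetic and the
# thresholds are illustrative; this is not the study's analysis code.
import numpy as np

FS = 25.0                                   # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / FS)                # one minute of data
resp = np.sin(2 * np.pi * 1.0 * t)          # ~60 breaths/min baseline
resp[int(20 * FS):int(28 * FS)] = 0.05      # inserted 8-second pause (apnea)
resp += 0.05 * np.random.default_rng(1).normal(size=t.size)

# Peak detection: local maxima above an amplitude threshold.
above = resp > 0.5
peaks = [i for i in range(1, len(resp) - 1)
         if above[i] and resp[i] >= resp[i - 1] and resp[i] > resp[i + 1]]

# Keep one peak per breath by enforcing a 0.5 s refractory period.
breath_times, last = [], -np.inf
for i in peaks:
    if t[i] - last > 0.5:
        breath_times.append(t[i])
        last = t[i]

APNEA_GAP_S = 5.0                           # gap defining a short apnea (assumed)
gaps = np.diff(breath_times)
for start, gap in zip(breath_times[:-1], gaps):
    if gap > APNEA_GAP_S:
        print(f"possible apnea: {gap:.1f} s gap after the breath at t={start:.1f} s")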

APA, Harvard, Vancouver, ISO, and other styles
31

Kirkby, S. D. (Stephen Denis). "Managing dryland salinisation with an integrated expert system/geographic information system / S.D. Kirkby." 1994. http://hdl.handle.net/2440/21517.

Full text
Abstract:
Bibliography: leaves 119-218.
xiv, 218 leaves : ill. (some col.), maps (some col.) ; 30 cm.
Title page, contents and abstract only. The complete thesis in print form is available from the University Library.
Salt Manager represents the software system developed by this thesis to implement an interactive land classification methodology. An Expert System (ES), a Geographic Information System (GIS), remotely sensed information and a relational database management system (RDBMS) have been utilised to construct the methodology.
Thesis (Ph.D.)--University of Adelaide, Dept. of Geography, 1995
APA, Harvard, Vancouver, ISO, and other styles
