Dissertations / Theses on the topic 'Minimum level of haemoglobin'

Consult the top 39 dissertations / theses for your research on the topic 'Minimum level of haemoglobin.'


1

Potgieter, Ryno. "Minimum sample size for estimating the Bayes error at a predetermined level." Diss., University of Pretoria, 2013. http://hdl.handle.net/2263/33479.

Full text
Abstract:
Determining the correct sample size is of the utmost importance in study design. Large samples yield classifiers or parameters with greater precision; conversely, samples that are too small yield unreliable results. Fixed sample size methods, determined by a specified level of error between the obtained parameter and the population value, or by a confidence level associated with the estimate, have been developed and are available. These methods are extremely useful when there is little or no cost, financial or time-related, involved in gathering the data. Alternatively, sequential sampling procedures have been developed specifically to obtain a classifier or parameter estimate that is as accurate as the researcher deems necessary, while sampling the smallest number of observations required to reach the specified level of accuracy. This dissertation discusses a sequential procedure, derived using Martingale Limit Theory, developed to train a classifier with the minimum number of observations needed to ensure, with a high enough probability, that the next observation sampled has a sufficiently low probability of being misclassified. Various classification methods are discussed and tested across multiple combinations of parameters, and the sequential procedure is also tested on microarray data. Various advantages and shortcomings of the sequential procedure are pointed out and discussed. This dissertation also proposes a new sequential procedure that trains the classifier to such an extent that it accurately estimates the Bayes error with high probability. The new procedure retains all the advantages of the previous method while addressing its most serious shortcoming. Ultimately, the sequential procedure developed enables the researcher to dictate how accurate the classifier should be and provides more control over the trained classifier.
Dissertation (MSc), University of Pretoria, 2013. Statistics. Unrestricted.
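
The stop-when-stable idea behind such sequential procedures can be sketched in a few lines. The example below uses a generic stopping rule with an illustrative nearest-centroid classifier on made-up Gaussian classes; it is not the Martingale-based rule derived in the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)

def draw(n):
    # Two overlapping Gaussian classes; the Bayes error is set by their overlap.
    y = rng.integers(0, 2, n)
    x = rng.normal(loc=2.0 * y, scale=1.0)
    return x, y

def sequential_train(eps=0.01, patience=5, batch=25, max_n=5000):
    """Grow the training set until the held-out error estimate stabilises."""
    x, y = draw(batch)
    prev, stable = None, 0
    while len(x) < max_n:
        # Nearest-centroid classifier on the data seen so far.
        m0, m1 = x[y == 0].mean(), x[y == 1].mean()
        xv, yv = draw(500)                      # fresh validation stream
        pred = (np.abs(xv - m1) < np.abs(xv - m0)).astype(int)
        err = np.mean(pred != yv)
        if prev is not None and abs(err - prev) < eps:
            stable += 1
            if stable >= patience:
                break                           # estimate has stabilised
        else:
            stable = 0
        prev = err
        xn, yn = draw(batch)                    # sample more observations
        x, y = np.concatenate([x, xn]), np.concatenate([y, yn])
    return len(x), err

n_used, err = sequential_train()
```

The returned error approximates the Bayes error of the two-Gaussian problem while using far fewer than the `max_n` observations a fixed-size design might reserve.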
APA, Harvard, Vancouver, ISO, and other styles
2

Thornton, Katherine C. (Katherine Claire). "Minimum carbon tax level needed to prompt a widespread shift to nuclear power." Thesis, Massachusetts Institute of Technology, 2007. http://hdl.handle.net/1721.1/41687.

Full text
Abstract:
Thesis (S.B.), Massachusetts Institute of Technology, Dept. of Nuclear Science and Engineering, 2007. Includes bibliographical references (leaves 45-50).
Carbon dioxide is suspected to be a major contributor to global warming. In the United States, nearly 70% of electricity is produced using coal or natural gas, both of which emit carbon dioxide into the environment. Nuclear power, which does not emit any carbon dioxide, produces 17% of the electricity consumed in the United States. In order to persuade utilities to switch from coal or natural gas to nuclear power, and thus reduce carbon dioxide emissions, a carbon tax should be implemented. Depending on the cost of construction for new nuclear plants and the level of savings that will incentivize utilities to switch, the carbon tax needed to promote nuclear power ranges between $20/tC and $200/tC. The full range of carbon tax scenarios is developed in this thesis, with the most likely carbon tax being $110/tC. This figure assumes a $1800/kW capital construction cost and a 10% risk perception premium on the busbar cost of power, to address the financial and industry community's somewhat negative perception of nuclear investments. From a policy perspective, this carbon tax would be more effective in moving utilities to nuclear power than a cap-and-trade policy. From a utility standpoint, switching to nuclear power under a carbon tax is less expensive than implementing a program of carbon capture and sequestration.
by Katherine C. Thornton. S.B.
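
The breakeven logic described above reduces to simple arithmetic: the tax must close the gap between nuclear and fossil busbar costs. The numbers below are round illustrative values, not the thesis's actual inputs:

```python
def breakeven_tax(cost_nuclear, cost_fossil, tc_per_mwh):
    """Carbon tax ($/tC) at which fossil generation costs as much as nuclear."""
    return max(0.0, (cost_nuclear - cost_fossil) / tc_per_mwh)

nuclear = 67.0   # $/MWh busbar cost incl. capital charge and risk premium (assumed)
coal = 42.0      # $/MWh busbar cost (assumed)
coal_tc = 0.25   # tonnes of carbon emitted per MWh of coal generation (approximate)

tax = breakeven_tax(nuclear, coal, coal_tc)   # -> 100.0 $/tC for these inputs
```

Varying the assumed capital cost and risk premium sweeps the breakeven tax across a range like the $20/tC to $200/tC band reported in the thesis.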
APA, Harvard, Vancouver, ISO, and other styles
3

Fararouei, Mohammad. "Maternal haemoglobin level and vitamin D supplementation during infancy in the Northern Finland 1966 Birth Cohort : effects on growth and development." Thesis, Imperial College London, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.446553.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ronnie, Roger. "Why are there so few minimum service level agreements? A case study of a metropolitan municipality." Master's thesis, Faculty of Law, 2019. http://hdl.handle.net/11427/30810.

Full text
Abstract:
In terms of the South African Constitution, every worker has the right to strike. This right is regulated in the Labour Relations Act (the LRA). Workers engaged in essential services are prohibited from striking. The prohibition does not apply if a minimum service level agreement, guaranteeing services in the event of a strike, has been concluded between employers and trade unions. The Essential Services Committee, established under the LRA, must ratify these agreements before they become effective. More than two decades after the LRA was promulgated, very few ratified minimum service level agreements have been concluded in the municipal sector. This study explores the reasons for this and suggests legislative and policy interventions that could be considered on a sector-wide basis. The study takes the form of a single case study of a metropolitan municipality. Data were obtained from two sources: 14 semi-structured interviews with participants, and an analysis of documents relevant to the regulation of essential services. The study established that the legislative framework for regulating essential services in South Africa is consistent with the principles and decisions laid down by the International Labour Organisation. It does not, however, provide guidelines for determining minimum service levels. An apparent unevenness between the representatives of the negotiating counterparts exists in the municipal sector in South Africa: many of the party representatives negotiating minimum service levels do not work in designated essential services or possess the relevant technical skills. The findings of the study suggest steps that could be taken to strengthen the capacity of the Essential Services Committee to assist parties in the municipal sector to conclude minimum service agreements and build the negotiating capacity of the parties.
The study also makes recommendations regarding improved participation by essential service workers and the broader community in the process.
APA, Harvard, Vancouver, ISO, and other styles
5

Pogorelcnik, Romain. "Decomposition by complete minimum separators and applications." Thesis, Clermont-Ferrand 2, 2012. http://www.theses.fr/2012CLF22301/document.

Full text
Abstract:
We worked on clique minimal separator decomposition. To compute this decomposition on a graph G, we need to compute the minimal separators of its triangulation H. In this context, our first efforts were directed at finding the clique minimal separators of a chordal graph. We defined a structure called the atom tree, inspired by the clique tree, to compute and represent the final products of the decomposition, called atoms. The purpose of this thesis was to apply this technique to biological data. While manipulating these data using Galois lattices, we noticed that clique minimal separator decomposition allows a divide-and-conquer approach on Galois lattices.
One biological application of this thesis was the detection of fused genes, which are important evolutionary events. Using algorithms developed in the course of our work, we implemented a program called MosaicFinder that allows efficient detection and grouping of these fusion events. Another biological application was the extraction of genes of interest from expression level data. The atom tree structure gave us a good visualization tool and allowed us to handle large datasets.
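
A single decomposition step, splitting a graph on a complete (clique) separator so that each component of G minus the separator, together with the separator, becomes one atom, can be sketched as follows. This is a minimal illustration of the operation, not the atom-tree algorithm developed in the thesis:

```python
import itertools

def is_clique(adj, s):
    """True if every pair of vertices in s is adjacent."""
    return all(v in adj[u] for u, v in itertools.combinations(s, 2))

def split_on_separator(adj, sep):
    """One decomposition step: each connected component of G - sep,
    together with sep itself, becomes an atom."""
    rest = set(adj) - set(sep)
    atoms, seen = [], set()
    for v in rest:
        if v in seen:
            continue
        comp, stack = set(), [v]      # flood-fill one component of G - sep
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(w for w in adj[u] if w in rest)
        seen |= comp
        atoms.append(comp | set(sep))
    return atoms

# A path a-b-c-d-e; {c} is a (trivially complete) minimal separator.
adj = {'a': {'b'}, 'b': {'a', 'c'}, 'c': {'b', 'd'}, 'd': {'c', 'e'}, 'e': {'d'}}
atoms = split_on_separator(adj, {'c'})   # atoms {a,b,c} and {c,d,e}
```

Repeating this step on each atom, using every clique minimal separator of the triangulation, yields the full decomposition.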
APA, Harvard, Vancouver, ISO, and other styles
6

Lombard, Alex. "Second-Strike Nuclear Forces and Neorealist Theory: Unit-Level Challenge or Balance-of-Power Politics as Usual?" Thesis, Department of Government and International Relations, 2007. http://hdl.handle.net/2123/2158.

Full text
Abstract:
What are the implications of second-strike nuclear forces for neorealism? The end of the Cold War yielded a unipolar structure of international politics defined by the military, economic, and political preponderance of the United States. According to balance-of-power theory, which lies at the heart of neorealism, unipolarity has a short life span, as secondary states waste little time in rectifying the global imbalance of power. Thus far, America remains unbalanced. Are we to take this as a refutation of balance-of-power theory? My thesis argues that second-strike arsenals render void the need to balance superior American military power. But because state survival is contingent not only upon military invulnerability (for which nuclear weapons are a sure guarantee), but also upon economic invulnerability (for which there is no absolute remedy), nuclear-weapon states are impelled to balance superior economic power for security reasons. By recasting balance-of-power theory in light of these assumptions, one can make sense of the great-power politics of the post-Cold War era.
Department of Government and International Relations
APA, Harvard, Vancouver, ISO, and other styles
7

Dong, Wenfang. "ATC constraints and modelling in global ATM environment." Thesis, Cranfield University, 2011. http://dspace.lib.cranfield.ac.uk/handle/1826/5637.

Full text
Abstract:
The United Kingdom's Civil Aviation Authority published its national aviation forecast in 2008. The forecast predicts that domestic traffic will increase by 3.5% per year, and that international traffic will grow, on average, by 4.5% during 2010-2020. Based on this prediction, traffic density will increase dramatically and airspace will become more and more congested. There are two potential solutions to this situation: improving the capability of air traffic flow management, or reducing the minimum separation between aircraft. This thesis focuses on the second solution, based on the constraints of communication, navigation and surveillance (CNS) systems. Cont'd.
APA, Harvard, Vancouver, ISO, and other styles
8

Bragg, John M. (John Morris) 1949. "The Effect of Remediation on Students Who Have Failed the TEAMS Minimum Competency Test." Thesis, University of North Texas, 1988. https://digital.library.unt.edu/ark:/67531/metadc330810/.

Full text
Abstract:
This qualitative case study provided a narrative portrait of 12 students in the 11th grade in one north Texas district who failed the initial administration of the Texas Educational Assessment of Minimum Skills (TEAMS) exit-level test. It also presented an account of their perceptions of the test and their efforts to overcome this educational hurdle. The following conclusions were drawn from the study. Limited English proficiency (LEP) students had difficulty mastering the language arts section of the test. A majority of the students reported that TEAMS failure had no social impact. Most of the students declined district-offered remediation. Students tended to perceive the test as a personal challenge. Those students who attended remedial tutoring sessions performed better on the following retest than those who declined remediation. Hispanic and Asian students cited additional study as the key to passing the test. Black students felt that the key to passing was to spend sufficient time while taking the test. Those students who were more verbal during their interviews tended to be more successful in passing the language arts section of the TEAMS. The following recommendations were made from the study: (a) students who fail the TEAMS by minimal margins should be encouraged to take remediation; (b) an intensive remedial English course for LEP students should be offered; (c) "high interest" TEAMS mini-lessons should be presented daily for several weeks as a lead-up to the TEAMS; (d) a TEAMS exit-level orientation program which stresses the importance of the test for the student's future should be implemented; and (e) additional research should be conducted on older students' verbal responses to see if a rich language approach in English classes, including listening, reading, writing, and speaking, will develop higher-level language skills.
APA, Harvard, Vancouver, ISO, and other styles
9

Avery-Gomm, Stephanie. "Determining the impacts of hydrological drought on endangered Nooksack dace (Rhinichthys cataractae) at the population and individual level : Implications for minimum environmental flow requirements." Thesis, University of British Columbia, 2013. http://hdl.handle.net/2429/44408.

Full text
Abstract:
Understanding the impacts of hydrological drought is crucial to the conservation of freshwater fishes. In British Columbia, Nooksack dace (Cyprinidae: Rhinichthys cataractae) are an endangered riffle specialist threatened by extremely low summer flows. The purpose of this thesis was to explore the impacts of drought on Nooksack dace, to assess whether pool habitats may act as refugia to mitigate these impacts, and to define minimum environmental flow requirements. The first two objectives were addressed using a combination of field surveys and experimental manipulations. A reduction in Nooksack dace population size with declining summer flow in Bertrand Creek, and a marked decrease in growth at low discharge in experimental riffles, indicated that low discharge has negative impacts on dace at both the population and individual levels. Pool habitats were found to play a minor role in mitigating the negative impacts of hydrological drought (e.g., decreased growth rate), save as a refuge from stranding when riffles dewater. The third objective was addressed using the Instream Flow Incremental Methodology (IFIM). Because this study involved an endangered species, an emphasis was placed on evaluating two fundamental assumptions of the methodology. Experimental results for Nooksack dace growth at different depths and velocities provided support for the first assumption, that density-based Habitat Suitability Curves (HSCs) accurately reflect habitat quality, but only for the lower limits of the HSCs. Next, a significant positive relationship between Weighted Usable Area (WUA) and dace biomass was found, supporting the assumption that such a relationship exists. However, this relationship was weak, indicating a high degree of uncertainty in how Nooksack dace biomass will respond at high discharges. The IFIM model predicted that habitat availability for Nooksack dace begins to decline most rapidly at discharges of 0.12 m³·s⁻¹.
As there is low confidence in the upper ranges of the HSCs, this low-flow threshold may underestimate declines with discharge, and therefore protection of at least 0.12 m³·s⁻¹ is considered necessary for the persistence of Nooksack dace individuals and populations. Compared to conventional instream flow criteria, 0.12 m³·s⁻¹ represents roughly 10% of mean annual discharge, which is the threshold for severely degraded habitat.
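
The WUA quantity at the centre of an IFIM analysis is a suitability-weighted sum of habitat area. A minimal sketch, using a multiplicative depth × velocity composite (one common convention; the thesis's exact aggregation may differ) and invented cell values:

```python
def weighted_usable_area(cells):
    """WUA: sum of cell areas weighted by composite habitat suitability.

    Each cell is (area_m2, depth_suitability, velocity_suitability), with
    suitabilities in [0, 1] read off the species' Habitat Suitability Curves.
    """
    return sum(area * d_suit * v_suit for area, d_suit, v_suit in cells)

# Three hypothetical riffle cells at one modelled discharge.
riffle = [(2.0, 0.9, 0.8), (1.5, 0.4, 1.0), (3.0, 0.0, 0.7)]
wua = weighted_usable_area(riffle)   # 2.0*0.72 + 1.5*0.4 + 0 = 2.04 m^2
```

Recomputing WUA over a range of simulated discharges produces the habitat-versus-flow curve from which a low-flow threshold such as 0.12 m³·s⁻¹ is read.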
APA, Harvard, Vancouver, ISO, and other styles
10

Tanniche, Imen. "Correlating antisense RNA performance with thermodynamic calculations." Thesis, Virginia Tech, 2013. http://hdl.handle.net/10919/49698.

Full text
Abstract:
Antisense RNA (asRNA) strategies are an effective and specific method for gene down-regulation at the post-transcriptional level. The major purpose of this study is to find a correlation between expression level and minimum free energy, to enable the design of specific asRNA fragments. The thermodynamics of asRNA and mRNA hybridization were computed for fluorescent protein reporter genes. Three different fluorescent proteins, (i) green fluorescent protein (GFP), (ii) cyan fluorescent protein (CFP) and (iii) yellow fluorescent protein (YFP), were used as reporters. Each fluorescent protein was cloned into the common pUC19 vector. The asRNA fragments were randomly amplified, and the resulting antisense DNA fragments were inserted into the constructed plasmid under the control of an additional inducible plac promoter and terminator. The expression levels of the fluorescent reporter proteins were determined in real time with a plate reader. Different results were observed depending on the fluorescent protein and the antisense fragment sequence. The CFP expression level was decreased by 50 to 78% compared to the control. With GFP, however, the down-regulation did not exceed 30% for the different constructs used. For certain constructs, the effect was the opposite of that expected and the expression level increased. In addition, YFP showed a weak signal compared to the growth media, so its expression level was difficult to determine. Based on these results, a thermodynamic model was developed to describe the relationship between the particular asRNA used and the observed expression level of the fluorescent reporter. The minimum free energy and binding percentage of the asRNA-mRNA complex were computed with the NUPACK software. The expression level was plotted as a function of the minimum free energy. The results showed a weak correlation overall, but linear trends were observed for low energy values and low expression levels of the CFP gene.
The linear trend does not hold for higher energy values. These findings suggest that the lower the energy, the more stable the asRNA-mRNA complex, and therefore the greater the reduction in expression. Meanwhile, the non-linearity implies that other parameters need to be investigated to improve the mathematical correlation. This model is expected to offer the chance to "fine-tune" asRNA effectiveness and subsequently modulate gene expression and redirect metabolic pathways toward the desired component. In addition, investigation of the localization of antisense binding indicates that there are some regions that favor hybridization and hence promote the down-regulation mechanisms.
Master of Science
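
Fitting the expression-versus-energy relationship described above is a one-line regression. The (ΔG, expression) pairs below are invented for illustration; real values would come from NUPACK and the plate-reader measurements:

```python
import numpy as np

# Hypothetical (minimum free energy, relative expression) pairs for a set of
# asRNA constructs: more negative energy -> more stable duplex -> less expression.
dg = np.array([-40.0, -35.0, -30.0, -25.0, -20.0])   # kcal/mol
expr = np.array([0.25, 0.35, 0.50, 0.70, 0.85])      # fraction of control

slope, intercept = np.polyfit(dg, expr, 1)   # least-squares line expr = a*dg + b
r = np.corrcoef(dg, expr)[0, 1]              # Pearson correlation
# A positive slope means lower (more negative) energy gives stronger knockdown.
```

With real data the correlation is weaker, which is exactly the non-linearity the abstract notes at higher energy values.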
APA, Harvard, Vancouver, ISO, and other styles
11

Andersson, Erica, and Ida Knutsson. "Immigration - Benefit or harm for native-born workers?" Thesis, Linnéuniversitetet, Institutionen för nationalekonomi och statistik (NS), 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-53829.

Full text
Abstract:
The aim of our study is to investigate the effect of immigrants on the wages of natives with divergent skill levels within one country. Skill level is measured as education level, and we focus on a level where, in our view, research is lacking, namely the effect on the wages of high-skilled native-born workers. Our contribution may thus be considered a complement to existing studies. Using panel data collected for the period 2000-2008 for the 290 municipalities in Sweden, to obtain regional variation, we investigate and interpret the estimated effect of immigration into Sweden on wages for native-born workers in the Swedish labor market. The main findings in the short run, controlling for age, unemployment, and differences between years and municipalities, are in line with theory: the closer the native-born and foreign-born workers are to being substitutes, the greater the adverse effect on native-born wages, given our assumption that immigrants are low-skilled. The short-run effect on wages for high-skilled native workers, when immigrants and natives are assumed to be complements, is positive, i.e. the wage for high-skilled natives increases as the share of immigrants increases. The effect on high-skilled native-born wages is positive even in the mid-long run, and adverse for low- and medium-skilled native workers. This is not an expected outcome, since theory predicts wages to be unaffected in the mid-long run. This may be the result of errors in the assumption that immigrants are low-skilled, or of five years being too short a time to observe the expected long-run effect; the Swedish labor market may need more time to adjust to the predicted outcome.
APA, Harvard, Vancouver, ISO, and other styles
12

Gao, Jie. "Lake stage fluctuation study in West-Central Florida using multiple regression models." [Tampa, Fla.] : University of South Florida, 2004. http://purl.fcla.edu/fcla/etd/SFE0000502.

Full text
APA, Harvard, Vancouver, ISO, and other styles
13

Barone, Anthony J. "State Level Earned Income Tax Credit’s Effects on Race and Age: An Effective Poverty Reduction Policy." Scholarship @ Claremont, 2013. http://scholarship.claremont.edu/cmc_theses/771.

Full text
Abstract:
In this paper, I analyze the effectiveness of state-level Earned Income Tax Credit (EITC) programs at reducing poverty. I conducted this analysis for the years 1991 through 2011 using a panel data model with fixed effects. The main independent variables of interest were the state and federal EITC rates, the minimum wage, gross state product, population, and unemployment, all by state. I found that increases in state EITC rates produced only a slight decrease in both the overall white below-poverty population and the corresponding white population under 18, while both the overall and under-18 black populations in this category realized moderate decreases in their poverty rates over the same period. I also compare the effectiveness of state-level EITCs and state minimum wages over the same period for these selected demographic groups.
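
The fixed-effects design used above can be sketched with a within transformation on a toy state-year panel. All names and values are illustrative, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy balanced state-year panel: poverty rate driven by the state EITC rate
# plus additive state and year effects.
n_states, n_years = 10, 8
state_fe = rng.normal(0, 1, n_states)[:, None]
year_fe = rng.normal(0, 1, n_years)[None, :]
eitc = rng.uniform(0, 0.4, (n_states, n_years))
beta_true = -2.0
poverty = 15 + beta_true * eitc + state_fe + year_fe \
          + rng.normal(0, 0.05, (n_states, n_years))

def within_transform(a):
    """Demean by state and year (two-way fixed-effects transformation)."""
    return a - a.mean(axis=1, keepdims=True) - a.mean(axis=0, keepdims=True) \
             + a.mean()

y, x = within_transform(poverty), within_transform(eitc)
beta_hat = (x * y).sum() / (x * x).sum()   # within (FE) estimator of beta
```

The demeaning sweeps out the state and year effects, so `beta_hat` recovers the EITC coefficient without estimating the fixed effects explicitly.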
APA, Harvard, Vancouver, ISO, and other styles
14

Золотіна, А. В. "Право громадян України на гідний рівень життя – проблеми реалізації". Thesis, Українська академія банківської справи Національного банку України, 2006. http://essuir.sumdu.edu.ua/handle/123456789/60512.

Full text
Abstract:
Today we all hope that after graduating from university a multitude of banks, enterprises, and organisations will be waiting for us. We already plan to earn enough money to satisfy all our needs and make our dreams come true. But unfortunately, this will not be the case for every one of us, and that is a fact. "But what about Article 48 of the Constitution of Ukraine?" you may object, under which everyone has the right to a standard of living sufficient for themselves and their family, including adequate food, clothing, and housing. But I will immediately ask you: how can a person achieve all of this? The answer is obvious: by working.
APA, Harvard, Vancouver, ISO, and other styles
15

Webster, Ronald A. "Development of statistical methods for the surveillance and monitoring of adverse events which adjust for differing patient and surgical risks." Thesis, Queensland University of Technology, 2008. https://eprints.qut.edu.au/16622/1/Ronald_Albert_Webster_Thesis.pdf.

Full text
Abstract:
The research in this thesis has been undertaken to develop statistical tools for monitoring adverse events in hospitals that adjust for varying patient risk. The studies involved a detailed literature review of risk adjustment scores for patient mortality following cardiac surgery, comparison of institutional performance, the performance of risk adjusted CUSUM schemes for varying risk profiles of the populations being monitored, the effects of uncertainty in the estimates of expected probabilities of mortality on performance of risk adjusted CUSUM schemes, and the instability of the estimated average run lengths of risk adjusted CUSUM schemes found using the Markov chain approach. The literature review of cardiac surgical risk found that the number of risk factors in a risk model and its discriminating ability were independent, the risk factors could be classified into their "dimensions of risk", and a risk score could not be generalized to populations remote from its developmental database if accurate predictions of patients' probabilities of mortality were required. The conclusions were that an institution could use an "off the shelf" risk score, provided it was recalibrated, or it could construct a customized risk score with risk factors that provide at least one measure for each dimension of risk. The use of report cards to publish adverse outcomes as a tool for quality improvement has been criticized in the medical literature. An analysis of the report cards for cardiac surgery in New York State showed that the institutions' outcome rates appeared overdispersed compared to the model used to construct confidence intervals, and the uncertainty associated with the estimation of institutions' outcome rates could be mitigated with trend analysis. 
A second analysis of the mortality of patients admitted to coronary care units demonstrated the use of notched box plots, fixed and random effect models, and risk adjusted CUSUM schemes as tools to identify outlying hospitals. An important finding from the literature review was that the primary reason for publication of outcomes is to ensure that health care institutions are accountable for the services they provide. A detailed review of the risk adjusted CUSUM scheme was undertaken and the use of average run lengths (ARLs) to assess the scheme, as the risk profile of the population being monitored changes, was justified. The ARLs for in-control and out-of-control processes were found to increase markedly as the average outcome rate of the patient population decreased towards zero. A modification of the risk adjusted CUSUM scheme, where the step size for in-control to out-of-control outcome probabilities was constrained to no less than 0.05, was proposed. The ARLs of this "minimum effect" CUSUM scheme were found to be stable. The previous assessment of the risk adjusted CUSUM scheme assumed that the predicted probability of a patient's mortality is known. A study of its performance, where the estimates of the expected probability of patient mortality were uncertain, showed that uncertainty at the patient level did not affect the performance of the CUSUM schemes, provided that the risk score was well calibrated. Uncertainty in the calibration of the risk model appeared to cause considerable variation in the ARL performance measures. The ARLs of the risk adjusted CUSUM schemes were approximated using simulation because the approximation method using the Markov chain property of CUSUMs, as proposed by Steiner et al. (2000), gave unstable results. The cause of the instability was the method of computing the Markov chain transition probabilities, where probability is concentrated at the midpoint of its Markov state. 
If probability was assumed to be uniformly distributed over each Markov state, the ARLs were stabilized, provided that the scores for the patients' risk of adverse outcomes were discrete and finite.
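
The risk-adjusted CUSUM discussed above scores each patient by the log-likelihood ratio of their observed outcome under an odds-ratio shift, in the style of Steiner et al. (2000). A minimal sketch, with illustrative threshold and shift parameters:

```python
import math

def ra_cusum(outcomes, probs, odds_ratio=2.0):
    """Risk-adjusted CUSUM path.

    outcomes: 0/1 adverse events; probs: each patient's predicted risk.
    The weight is the log-likelihood ratio for a shift of `odds_ratio`
    in the odds of an adverse outcome; the chart would signal when the
    statistic crosses a decision threshold h (not applied here).
    """
    s, path = 0.0, []
    for y, p in zip(outcomes, probs):
        denom = 1.0 - p + odds_ratio * p
        w = math.log(odds_ratio / denom) if y else math.log(1.0 / denom)
        s = max(0.0, s + w)          # CUSUM resets at zero, never negative
        path.append(s)
    return path

# Low-risk patients who nevertheless die push the statistic up quickly.
path = ra_cusum([0, 0, 1, 1, 1], [0.1, 0.1, 0.05, 0.05, 0.05])
```

Because each weight depends on the patient's own predicted risk, a death in a low-risk patient raises the statistic far more than a death in a high-risk patient, which is the risk adjustment the thesis evaluates.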
APA, Harvard, Vancouver, ISO, and other styles
16

Webster, Ronald A. "Development of statistical methods for the surveillance and monitoring of adverse events which adjust for differing patient and surgical risks." Queensland University of Technology, 2008. http://eprints.qut.edu.au/16622/.

Full text
Abstract:
The research in this thesis has been undertaken to develop statistical tools for monitoring adverse events in hospitals that adjust for varying patient risk. The studies involved a detailed literature review of risk adjustment scores for patient mortality following cardiac surgery, comparison of institutional performance, the performance of risk adjusted CUSUM schemes for varying risk profiles of the populations being monitored, the effects of uncertainty in the estimates of expected probabilities of mortality on performance of risk adjusted CUSUM schemes, and the instability of the estimated average run lengths of risk adjusted CUSUM schemes found using the Markov chain approach. The literature review of cardiac surgical risk found that the number of risk factors in a risk model and its discriminating ability were independent, the risk factors could be classified into their "dimensions of risk", and a risk score could not be generalized to populations remote from its developmental database if accurate predictions of patients' probabilities of mortality were required. The conclusions were that an institution could use an "off the shelf" risk score, provided it was recalibrated, or it could construct a customized risk score with risk factors that provide at least one measure for each dimension of risk. The use of report cards to publish adverse outcomes as a tool for quality improvement has been criticized in the medical literature. An analysis of the report cards for cardiac surgery in New York State showed that the institutions' outcome rates appeared overdispersed compared to the model used to construct confidence intervals, and the uncertainty associated with the estimation of institutions' outcome rates could be mitigated with trend analysis. 
A second analysis of the mortality of patients admitted to coronary care units demonstrated the use of notched box plots, fixed and random effect models, and risk adjusted CUSUM schemes as tools to identify outlying hospitals. An important finding from the literature review was that the primary reason for publication of outcomes is to ensure that health care institutions are accountable for the services they provide. A detailed review of the risk adjusted CUSUM scheme was undertaken and the use of average run lengths (ARLs) to assess the scheme, as the risk profile of the population being monitored changes, was justified. The ARLs for in-control and out-of-control processes were found to increase markedly as the average outcome rate of the patient population decreased towards zero. A modification of the risk adjusted CUSUM scheme, where the step size for in-control to out-of-control outcome probabilities was constrained to no less than 0.05, was proposed. The ARLs of this "minimum effect" CUSUM scheme were found to be stable. The previous assessment of the risk adjusted CUSUM scheme assumed that the predicted probability of a patient's mortality is known. A study of its performance, where the estimates of the expected probability of patient mortality were uncertain, showed that uncertainty at the patient level did not affect the performance of the CUSUM schemes, provided that the risk score was well calibrated. Uncertainty in the calibration of the risk model appeared to cause considerable variation in the ARL performance measures. The ARLs of the risk adjusted CUSUM schemes were approximated using simulation because the approximation method using the Markov chain property of CUSUMs, as proposed by Steiner et al. (2000), gave unstable results. The cause of the instability was the method of computing the Markov chain transition probabilities, where probability is concentrated at the midpoint of its Markov state. 
If probability was assumed to be uniformly distributed over each Markov state, the ARLs were stabilized, provided that the scores for the patients' risk of adverse outcomes were discrete and finite.
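As a rough illustration of the risk adjusted CUSUM being assessed, the sketch below uses log-likelihood-ratio scores of the form proposed by Steiner et al. (2000). The odds ratio under the out-of-control hypothesis, the signalling threshold, and the outcome sequences are illustrative assumptions, not values from the thesis; in practice the in-control and out-of-control ARLs would be estimated by simulating many such sequences.

```python
import math

def ra_cusum(outcomes, probs, odds_ratio=2.0, threshold=4.5):
    """Risk adjusted CUSUM with Steiner-style log-likelihood-ratio scores.

    outcomes: 1 = death, 0 = survival; probs: predicted mortality risks.
    The out-of-control hypothesis inflates each patient's odds of death
    by odds_ratio. Returns the CUSUM path and the index of the first
    signal (None if the chart never crosses the threshold).
    """
    s, path, signal = 0.0, [], None
    for t, (y, p) in enumerate(zip(outcomes, probs)):
        denom = 1.0 - p + odds_ratio * p
        w = math.log(odds_ratio / denom) if y == 1 else math.log(1.0 / denom)
        s = max(0.0, s + w)                 # chart resets at zero
        path.append(s)
        if signal is None and s >= threshold:
            signal = t
    return path, signal

# All survivals: the chart stays at zero. All deaths at 10% predicted
# risk accumulate positive scores and trip the alarm after a few patients.
_, quiet = ra_cusum([0] * 50, [0.1] * 50)
_, alarm = ra_cusum([1] * 50, [0.1] * 50)
```

The score for a surviving patient is negative and shrinks with the predicted risk, which is why the ARL behaviour depends on the risk profile of the monitored population, as the thesis investigates.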
APA, Harvard, Vancouver, ISO, and other styles
17

Ben, Salem Aymen. "The Application of Multiuser Detection to Spectrally Efficient MIMO or Virtual MIMO SC-FDMA Uplinks in LTE Systems." Thèse, Université d'Ottawa / University of Ottawa, 2013. http://hdl.handle.net/10393/30351.

Full text
Abstract:
Single Carrier Frequency Division Multiple Access (SC-FDMA) is a multiple access transmission scheme that has been adopted in the 4th generation 3GPP Long Term Evolution (LTE) of cellular systems. In fact, its relatively low peak-to-average power ratio (PAPR) makes it ideal for the uplink transmission where the transmit power efficiency is of paramount importance. Multiple access among users is made possible by assigning different users to different sets of non-overlapping subcarriers. With the current LTE specifications, if an SC-FDMA system is operating at its full capacity and a new user requests channel access, the system redistributes the subcarriers in such a way that it can accommodate all of the users. Having fewer subcarriers for transmission, every user has to increase its modulation order (for example from QPSK to 16QAM) in order to keep the same transmission rate. However, increasing the modulation order is not always possible in practice and may introduce considerable complexity to the system. The technique presented in this thesis describes a new way of adding more users to an SC-FDMA system by assigning the same sets of subcarriers to different users. The main advantage of this technique is that it allows the system to accommodate more users than conventional SC-FDMA, which corresponds to increasing the spectral efficiency without requiring a higher modulation order or using more bandwidth. During this work, special attention was paid to the cases where two and three source signals are transmitted on the same set of subcarriers, which leads respectively to doubling and tripling the spectral efficiency. Simulation results show that by using the proposed technique, it is possible to add more users to any SC-FDMA system without increasing the bandwidth or the modulation order while keeping the same performance in terms of bit error rate (BER) as the conventional SC-FDMA.
This is realized by slightly increasing the energy per bit to noise power spectral density ratio (Eb/N0) at the transmitters.
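The PAPR advantage mentioned in this abstract can be illustrated numerically: DFT-spreading the modulation symbols before subcarrier mapping (SC-FDMA) yields a lower peak-to-average power ratio than mapping them directly onto subcarriers (OFDMA). The FFT size, number of occupied subcarriers, QPSK alphabet and localized mapping below are illustrative assumptions, not the thesis's simulation setup.

```python
import numpy as np

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband block, in dB."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(0)
N, M, trials = 256, 64, 500        # FFT size, occupied subcarriers, symbols
papr_ofdma, papr_scfdma = [], []
for _ in range(trials):
    sym = (rng.choice([-1.0, 1.0], M) + 1j * rng.choice([-1.0, 1.0], M)) / np.sqrt(2)
    grid = np.zeros(N, complex)
    grid[:M] = sym                 # OFDMA: symbols straight onto subcarriers
    papr_ofdma.append(papr_db(np.fft.ifft(grid)))
    grid[:M] = np.fft.fft(sym)     # SC-FDMA: M-point DFT spreading first
    papr_scfdma.append(papr_db(np.fft.ifft(grid)))

print(round(np.mean(papr_ofdma) - np.mean(papr_scfdma), 1))  # positive gap, in dB
```

The gap of a couple of dB is what makes SC-FDMA attractive for power-limited uplink transmitters.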
APA, Harvard, Vancouver, ISO, and other styles
18

CHI-TIEN, SU, and 蘇啟天. "Minimum aberrration and estimaiton index in three-level design." Thesis, 2005. http://ndltd.ncl.edu.tw/handle/42160069008266366906.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Liu, Chih-Yen, and 劉致彥. "2^(K-P)design minimum total level change analysis." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/7ep8c8.

Full text
Abstract:
Master's thesis, National Chiao Tung University, Institute of Statistics, academic year 107. The 2^(K-P) design problem is based on the 2^K design in design of experiments. Peng and Lin (2018) provide a method in their paper that constructs a complete graph to represent the experimental runs, turning this problem into a graph-theory problem. After reformulating the problem, Lin applied Concorde (Applegate et al. 2001), a traveling salesman problem (TSP) solver, to obtain its best solution. Since the graph becomes extremely large as k-p grows, we propose a different method, a block-by-generator method, to solve this problem. Results and analysis are presented.
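The objective being minimized here, the total number of factor-level changes across consecutive runs, can be illustrated on the full 2^k design, where a Gray-code run order attains the minimum of 2^k - 1 changes. This is only a toy illustration of the objective, not the thesis's block-by-generator method or the fractional 2^(K-P) case.

```python
def level_changes(order, k):
    """Total factor-level switches along a run order.

    Runs are encoded as 0..2^k - 1 bit patterns, one bit per factor,
    so the switches between two runs are the set bits of their XOR.
    """
    return sum(bin(a ^ b).count("1") for a, b in zip(order, order[1:]))

def gray_order(k):
    """Binary-reflected Gray code: consecutive runs differ in one factor."""
    return [i ^ (i >> 1) for i in range(2 ** k)]

k = 4
standard = list(range(2 ** k))        # Yates / standard order
gray = gray_order(k)
print(level_changes(standard, k))     # 26 switches for k = 4
print(level_changes(gray, k))         # 15 switches, the minimum (2^4 - 1)
```

Each step of a Gray-code order changes exactly one factor, so no ordering of all 2^k runs can do better; the interesting open question, which the thesis addresses, is ordering the runs of a fraction.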
APA, Harvard, Vancouver, ISO, and other styles
20

Khan, Sohail Razi. "MSL Framework: (Minimum Service Level Framework) for cloud providers and users." Doctoral thesis, 2018. http://hdl.handle.net/10284/7120.

Full text
Abstract:
Cloud Computing ensures parallel computing and has emerged as an efficient technology to meet the challenges of the rapid growth of data that we have experienced in this Internet age. Cloud computing is an emerging technology that offers subscription based services, and provides different models such as IaaS, PaaS and SaaS among other models to cater to the needs of different user groups. The technology has enormous benefits, but there are serious concerns and challenges related to the lack of uniform standards, or the nonexistence of a minimum benchmark for the level of services offered across the industry, needed to provide an effective, uniform and reliable service to cloud users. As cloud computing gains popularity, organizations and users are having problems adopting the service due to the lack of a minimum service level framework which can act as a benchmark in the selection of the cloud provider and ensure quality of service according to the user's expectations. The situation becomes more critical due to the distributed nature of the service provider, which can be offering service from any part of the world. Due to the lack of a minimum service level framework to act as a benchmark for providing a uniform service across the industry, serious concerns have been raised recently in terms of security and data privacy breaches, authentication and authorization issues, lack of third party audit and identity management problems, integrity, confidentiality and variable data availability standards, no uniform incident response and monitoring standards, interoperability and lack of portability standards, lack of infrastructure protection services standards, and weak governance and compliance standards, all major causes of concern for cloud users. Due to confusion and the absence of universally agreed SLAs for a service model, different qualities of service are being provided across the cloud industry.
Currently there is no uniform performance model agreed by all stakeholders that can provide performance criteria to measure, evaluate, and benchmark the level of services offered by various cloud providers in the industry. With the implementation of the General Data Protection Regulation (GDPR) and demand from cloud users for Green SLAs that provide a better resource allocation mechanism, there will be serious implications for cloud providers and their consumers due to the lack of uniformity in SLAs and the variable standards of service offered by various cloud providers. This research examines weaknesses in the service level agreements offered by various cloud providers and the impact of the absence of a uniformly agreed minimum service level framework on the adoption and usage of cloud services. The research is focused around a higher education case study and proposes a conceptual model, based on a uniform minimum service model, that acts as a benchmark for the industry to ensure quality of service to cloud users in the higher education institution and to remove the barriers to the adoption of cloud technology. The proposed Minimum Service Level (MSL) framework provides a set of minimum and uniform standards in the key areas of concern raised by the participants of the HE institution, standards which are essential to cloud users and provide a minimum quality benchmark that can become a uniform standard across the industry.
The proposed model produces cloud computing implementation evaluation criteria in an attempt to reduce the adoption barrier of cloud technology and to set minimum uniform standards followed by all cloud providers, regardless of their hosting location, so that their performance can be measured, evaluated and compared across the industry to improve the overall QoS (Quality of Service) received by cloud users, remove the adoption barriers and concerns of cloud users, and increase competition across the cloud industry.
APA, Harvard, Vancouver, ISO, and other styles
21

Li, Bing-Hung, and 李秉泓. "A High-Level Synthesis Approach for Minimum-Area Low-Power Gated Clock Designs." Thesis, 2011. http://ndltd.ncl.edu.tw/handle/2xnt39.

Full text
Abstract:
Master's thesis, Chung Yuan Christian University, Institute of Electronic Engineering, academic year 99. Clock gating is one of the most useful techniques to reduce the dynamic power consumption of synchronous sequential circuits. To effectively reduce the power consumption of the clock tree, previous work has shown that clock control logic should be synthesized in the high-level synthesis stage. However, previous work may suffer from a large circuit area overhead on the clock control logic. In this thesis, we present an ILP (integer linear programming) approach that considers both the clock tree and the clock control logic. Our objective is not only to conform to the constraint on power consumption, but also to minimize the area overhead of the clock control logic. Benchmark data consistently show that our approach can greatly reduce the circuit area overhead with a small penalty in total power consumption.
APA, Harvard, Vancouver, ISO, and other styles
22

Hsiao, Ya-Chun, and 蕭雅純. "Two-level Minimum Aberration Fractional Factorial Designs in the Presence of Dispersion Factors." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/92046189624624078614.

Full text
Abstract:
Master's thesis, National Taiwan University, Institute of Agronomy, academic year 95. During the initial stages of experimentation, two-level regular fractional factorial designs (FFDs) are commonly used to identify important factors which may significantly affect the response(s) of the experiment. The homogeneity of variance is a basic assumption in the ANOVA for location effects. The design issue of optimal 2^(n-p) regular FFDs based on the homogeneous variance assumption has been studied extensively. However, when the variance of the response variable changes as some specific factors change from one setting to another, the factors affecting the variation of the response are called dispersion factors in this study. Interestingly, to the best of our knowledge, the issue of minimum aberration designs for location effects in the presence of dispersion factors has not been addressed in the literature. In this study, we investigate minimum aberration 2^(n-p) regular FFDs under the assumption that there are some specific factors responsible for the dispersion of the response. The dispersion effects may violate the usual assumption of variance homogeneity in ANOVA; therefore, the aberration criterion needs to be modified in order to discuss this issue. It is anticipated that the choice of minimum aberration designs may depend upon prior information on the dispersion effects. Specific attention is first given to the simplest situation, in which exactly one factor is responsible for the dispersion effects. After a thorough investigation of this case, we extend the results to the situation in which two factors involve dispersion effects.
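For reference, the ordinary (location-effect) aberration criterion that this thesis modifies compares word-length patterns of the defining contrast subgroup: designs are ranked by sequentially minimizing the number of short words. A minimal sketch for a 2^(6-2) design with illustrative generators is shown below; the dispersion-modified criterion developed in the thesis is not reproduced.

```python
from itertools import combinations

def wordlength_pattern(generators, n):
    """Word-length pattern (A1, ..., An) of a regular 2^(n-p) design.

    Each generator is a frozenset of factor labels, e.g. I = ABCE is
    frozenset("ABCE"). The defining words are all non-empty products
    (symmetric differences) of the generators.
    """
    words = []
    for r in range(1, len(generators) + 1):
        for combo in combinations(generators, r):
            w = frozenset()
            for g in combo:
                w = w.symmetric_difference(g)
            words.append(w)
    A = [0] * (n + 1)
    for w in words:
        A[len(w)] += 1
    return A[1:]

# 2^(6-2) design with generators E = ABC, F = BCD,
# giving defining words ABCE, BCDF and their product ADEF.
print(wordlength_pattern([frozenset("ABCE"), frozenset("BCDF")], 6))
# -> [0, 0, 0, 3, 0, 0]: resolution IV with three length-4 words
```

A minimum aberration design is one whose pattern (A1, A2, ...) is lexicographically smallest among all 2^(n-p) designs with the same n and p.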
APA, Harvard, Vancouver, ISO, and other styles
23

Hsiao, Ya-Chun. "Two-level Minimum Aberration Fractional Factorial Designs in the Presence of Dispersion Factors." 2007. http://www.cetd.com.tw/ec/thesisdetail.aspx?etdun=U0001-2407200716263800.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

CHANG, TSUNG-HSING, and 張宗興. "Specification and Impact Analysis of Minimum Short Circuit Current Level for 69KV Transmission System." Thesis, 1996. http://ndltd.ncl.edu.tw/handle/21702562335080058888.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Xie, Zong-Han, and 謝宗翰. "The Minimization of Average Power under the Minimum Execution Time in High-level Synthesis." Thesis, 2017. http://ndltd.ncl.edu.tw/handle/31455421018406169175.

Full text
Abstract:
Master's thesis, Chung Yuan Christian University, Institute of Electronic Engineering, academic year 105. In high-level synthesis, operation scheduling is a very important step, and most conventional operation scheduling algorithms make a trade-off between control steps and resources. We propose a method that utilizes the characteristics of operation delay to decrease average power under the minimum execution time. In increasingly complex circuit design, low power is also an important design objective, because people rely heavily on portable electronic products in modern life. In this thesis, we propose an ILP (integer linear programming) formulation to model the problem of minimizing average power under the minimum execution time in high-level synthesis. The ILP constraints express the relationship between operation scheduling and operation delay selection in order to reduce the average power. With these formulations, we simultaneously maintain the minimum cycle time and decrease average power efficiently to obtain the best solution.
APA, Harvard, Vancouver, ISO, and other styles
26

Hsueh, Wei-Chun, and 薛頠浚. "Mining Cross-Level Association Rules with Multiple Minimum Supports within the Sold Periods of Products." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/54259651337661015509.

Full text
Abstract:
Master's thesis, National Central University, Institute of Industrial Management, academic year 92. Cross-level association rule mining with multiple minimum supports is an important generalization of the association rule mining problem. Instead of setting a single minimum support for all items, Liu et al. proposed a method, named MSApriori, that allows users to specify multiple minimum supports to reflect the natures of the items. Because not all items are sold throughout the whole year, the transactions in the sold periods of items should be considered when calculating the supports of items. Previous techniques for mining cross-level association rules with multiple minimum supports are mostly top-down, progressively deepening methods extended from the Apriori algorithm, e.g. MMS_Cumulate, and they suffer from poor mining efficiency and incompleteness of the mined rules. In this research, we propose a bottom-up, simultaneous merging method based on the CL_FP-tree, called CL_FP-tree (MIS), to improve the efficiency and completeness of mining cross-level association rules with multiple minimum supports. We extend the support-counting procedure proposed by Alex H.W. Lin (2003) not only to count the supports of all items, but also to determine the sold periods of all items as the basis of support counting. CL_FP-tree (MIS) aims to reduce the number of database rescans needed to find the cross-level information. We implemented CL_FP-tree (MIS) on real data and found that its efficiency is better than that of CL_Apriori (MIS), and that the number of cross-level association rules found by the CL_FP-tree (MIS) algorithm is larger than that found by CL_Apriori (MIS). Furthermore, we address the problem that the number of frequent 1-items decreases as the number of transactions increases.
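The multiple-minimum-supports idea can be sketched as follows: under the MSApriori rule of Liu et al., an itemset is frequent when its support reaches the smallest MIS value among its items, so rare items can be given a lower threshold. The transactions and MIS values below are toy assumptions; the sold-period adjustment (restricting support counting to each item's sold period) and the CL_FP-tree structure are not reproduced.

```python
from itertools import combinations

transactions = [
    {"bread", "milk"}, {"bread", "ski"}, {"milk", "ski"},
    {"bread", "milk", "ski"}, {"bread"}, {"milk"},
]
mis = {"bread": 0.4, "milk": 0.4, "ski": 0.2}   # rare item gets a lower MIS

def support(itemset):
    """Fraction of transactions containing every item of the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def frequent_itemsets(mis, max_size=2):
    out = []
    for r in range(1, max_size + 1):
        for c in combinations(sorted(mis), r):
            s = support(set(c))
            if s >= min(mis[i] for i in c):     # MSApriori: min of the items' MIS
                out.append((c, s))
    return out

# Five frequent itemsets; {bread, milk} misses its 0.4 threshold, while the
# same support (2/6) keeps {bread, ski} and {milk, ski} thanks to ski's low MIS.
print(frequent_itemsets(mis))
```

Note that with item-dependent thresholds the classic downward-closure property no longer holds directly, which is why MSApriori-style algorithms sort items by MIS; the brute-force enumeration above side-steps that issue for the toy example.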
APA, Harvard, Vancouver, ISO, and other styles
27

Sheik, Hafsa. "The influence of a blood donors sitting position during time of waiting on the change of haemoglobin concentration during blood donation." Thesis, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-349266.

Full text
Abstract:
The routines for blood testing were changed during 2010 at the blood bank in UAS. Previously, the blood test was taken before the donation; now it is taken after the donation. Along with this, the blood bank raised the lowest haemoglobin level allowed for blood donation by 10 g/L for both men and women. The limits are now 125 g/L for women and 135 g/L for men. After the increase, it was noticed that the number of blood donors deferred due to low Hb levels increased. A study made in 2013 investigated how much the Hb level actually changed during a blood donation; it showed that Hb was lowered on average by 6 g/L, not by 10 g/L as previously thought. The aim of this study was to see whether the sitting position of the blood donor during the waiting time, and the supine position during the blood donation, had any effect on the difference in Hb level during the blood donation. Data from the 120 blood donors in the earlier study were collected. Hb values before and after blood donation were taken from the earlier study, and registered times were taken from the database Prosang. The waiting time, the time of blood donation and the difference in Hb levels were calculated and correlated with Spearman's correlation coefficient. The results did not show any correlation between the times and the difference in Hb levels. One reason may be that blood donor physiology differs between individuals, and thus the change in Hb level can vary.
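The rank correlation used in this study can be sketched in pure Python. The waiting times and Hb differences below are hypothetical numbers chosen for illustration, not the study's data; the rank function uses average ranks so that ties are handled the standard way.

```python
def ranks(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1                      # extend the tie group
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho: Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

wait_min = [5, 12, 3, 20, 8]    # hypothetical waiting times (minutes)
hb_drop = [4, 7, 5, 6, 8]       # hypothetical Hb differences (g/L)
print(round(spearman(wait_min, hb_drop), 3))  # -> 0.5
```

A value near zero, as the study reports for its real data, would indicate no monotone association between the times and the Hb change.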
APA, Harvard, Vancouver, ISO, and other styles
28

Saxena, Manjula. "A study of factors influencing mastery of minimum levels of learning in cognitive areas at the primary level." Thesis, 2001. http://hdl.handle.net/2009/1942.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Karakaya, Fuat. "Automated exploration of the asic design space for minimum power-delay-area product at the register transfer level." 2004. http://etd.utk.edu/2004/KarakayaFuat.pdf.

Full text
Abstract:
Thesis (Ph. D.)--University of Tennessee, Knoxville, 2004. Title from title page screen (viewed May 13, 2004). Thesis advisor: Donald W. Bouldin. Document formatted into pages (x, 102 p. : ill. (some col.)). Vita. Includes bibliographical references (p. 99-101).
APA, Harvard, Vancouver, ISO, and other styles
30

Yang, Wan-Chi, and 楊婉祺. "Multi-level Fuzzy Mining with Multiple Minimum Supports Association Rules and Support Tuning Mechanism for E-learning Materials Recommendation." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/hzx626.

Full text
Abstract:
Master's thesis, National Formosa University, Institute of Information Management, academic year 96. In the field of data mining, association rules are used to analyze customer relationships in transaction databases, and web usage mining analyzes the logs of users browsing a website. A learning management system also contains a large number of learner browsing logs, so association rules can be mined from the e-learning database to retrieve learners' behavior patterns. The most common and widespread approach to association rule mining is the Apriori algorithm, and related research has applied it to mining multiple-level and quantitative association rules with multiple minimum supports. However, Apriori-based algorithms are inefficient when mining with low support thresholds, long patterns, or huge numbers of frequent patterns. This thesis therefore uses P-tree and FP-tree-like structures to propose the new MFMFP-tree and MFMQFP-Growth algorithms for mining multi-level fuzzy association rules from quantitative transactions with multiple minimum supports, applied to mining frequent patterns of learner behavior. Learning paths over the materials chosen by a learner are retrieved through the learning management system to provide recommendations for the next learning course. Finally, we also propose a support tuning mechanism under the new algorithms.
APA, Harvard, Vancouver, ISO, and other styles
31

Parayandeh, Amir. "System Level Energy Optimization Techniques for a Digital Load Supplied with a DC-DC Converter." Thesis, 2013. http://hdl.handle.net/1807/35923.

Full text
Abstract:
The demand to integrate more features has significantly increased the complexity and power consumption of smart portable devices. Extending the battery life-time has therefore become a major challenge, and new approaches are required to decrease the power consumed from the source. Traditionally the focus has been on reducing the dynamic power consumption of the digital circuits used in these devices. However, as process technologies scale, reducing the dynamic power has become less effective due to the increased impact of leakage power. Alternatively, a more effective approach to minimize the power consumption is to continuously optimize the ratio of the dynamic and leakage power while delivering the required performance. This work presents a novel power-aware system for dynamic minimum power point tracking of digital loads in portable applications. The system integrates a dc-dc converter power-stage and the supplied digital circuit. The integrated dc-dc converter IC utilizes a mixed-signal current program mode (CPM) controller to regulate the supply voltage of the digital load IC. This embedded converter inherently measures the power consumption of the load in real time, eliminating the need for additional power sensing circuitry. Based on the information available in the CPM controller, a minimum power point tracking (MiPPT) controller sets the supply and threshold voltages for the digital load to minimize its power consumption while maintaining a target frequency. The 10MHz mixed-signal CPM controlled dc-dc converter and the digital load are fabricated in 0.13µm IBM technology. Experimental results verify that the introduced system results in up to 30% lower power consumption from the battery source.
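The minimum power point tracking idea can be illustrated with a toy perturb-and-observe loop. The power model below (a quadratic dynamic term plus an exponential leakage term, with the supply voltage tied to the threshold voltage through a fixed speed margin) and all of its constants are illustrative assumptions, not the thesis's circuit; the real system measures load power through the CPM controller rather than evaluating a model.

```python
import math

def total_power(vth, f_target=1.0, c=1.0, delta=0.5, i0=2.0, s=0.1):
    """Toy power model at a fixed target frequency.

    The supply voltage needed to sustain f_target shrinks with the
    threshold voltage (vdd = vth + delta here), so raising vth cuts
    leakage but inflates dynamic power, giving an interior minimum.
    """
    vdd = vth + delta
    dynamic = c * vdd ** 2 * f_target
    leakage = vdd * i0 * math.exp(-vth / s)
    return dynamic + leakage

def mippt(vth0=0.1, step=0.01, iters=200):
    """Perturb-and-observe: walk vth downhill in measured power."""
    vth, p, direction = vth0, total_power(vth0), 1.0
    for _ in range(iters):
        cand = vth + direction * step
        p_cand = total_power(cand)
        if p_cand < p:
            vth, p = cand, p_cand     # keep moving the same way
        else:
            direction = -direction    # power rose: reverse the perturbation
    return vth, p

vth_opt, p_opt = mippt()              # settles near the minimum power point
```

Because the tracker only compares successive power measurements, it keeps following the minimum as temperature or workload shifts the curve, which is the "dynamic" aspect the thesis targets.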
APA, Harvard, Vancouver, ISO, and other styles
32

Ju, Chen Ying, and 陳盈如. "A study of certification instrument for assessing the minimum competencies of home economics teachers at high school level- developing a paper-pencil test." Thesis, 1998. http://ndltd.ncl.edu.tw/handle/57423319786140846303.

Full text
Abstract:
Master's thesis, National Taiwan Normal University, Department of Home Economics Education, academic year 86. In response to the new policy of teacher education, this study aims to identify the minimum home economics competencies of home economics teachers at the high school level, in order to develop and validate an assessment instrument to assess and certify home economics teachers. The instrument can also be used to assess the competencies of students in home economics teacher education programs. It is expected that the study will be of value in upgrading the quality of home economics teachers and of home economics teaching in general. The research methods of this study included document analysis, panel discussion and a questionnaire survey. First, document analysis and panel discussion were used to identify the minimum competencies of home economics teachers at the high school level, and a blueprint of the "Paper-Pencil Test of Minimum Home Economics Competencies" was constructed by panel discussion. Second, the test was developed and submitted to an expert panel to establish content validity; the test was then piloted and examined for reliability and validity. Finally, a passing score of the test for certification of minimum home economics competencies was proposed. The main conclusions of this study were as follows: 1. The majority of home economics competencies comprise knowledge of "Means of Home Economics", "Individual and Family", "Management of Consumption and Resources", "Food", "Clothes" and "Living". 2. The "Paper-Pencil Test of Minimum Home Economics Competencies" is a test with good reliability and validity. 3. The instruments for teacher certification should include a basic-skill test, a subject-content test and a basic-pedagogy test; among these, the subject-content test could be developed as a paper-pencil test. Suggestions for future study, test use, home economics student teacher education and teacher employment were proposed.
APA, Harvard, Vancouver, ISO, and other styles
33

Rollo, Susan Noble. "Herd-level Risk Factors Associated with Antimicrobial Susceptibility Patterns and Distributions in Fecal Bacteria of Porcine Origin." Thesis, 2011. http://hdl.handle.net/1969.1/ETD-TAMU-2011-08-9755.

Full text
Abstract:
The purpose of this dissertation is threefold: first, to determine the differences in apparent prevalence and the antimicrobial susceptibility of Campylobacter spp. between antimicrobial-free and conventional swine farms; secondly, to introduce an appropriate statistical model to compare the minimum inhibitory concentration distributions of Escherichia coli and Campylobacter spp. isolated from both farm types; and thirdly, to examine the potential herd level risk factors that may be associated with antimicrobial resistance of Campylobacter spp. and E. coli isolates from finishers on antimicrobial-free and conventional farming systems. In addition, a critical review of studies that have compared the levels and patterns of antimicrobial resistance among animals from antimicrobial-free and conventional farming practices was performed. Fecal samples from 15 pigs were collected from each of 35 antimicrobial-free and 60 conventional farms in the Midwestern U.S. Campylobacter spp. was isolated from 464 of 1,422 fecal samples, and each isolate was tested for susceptibility to 6 antimicrobials. The apparent prevalence of Campylobacter spp. isolates was approximately 33 percent on both conventional and antimicrobial-free farms. The proportion of antimicrobial resistance among Campylobacter was higher for three antimicrobials within conventional compared to antimicrobial-free farms. The susceptibilities of populations of bacteria to antimicrobial drugs were summarized as minimum inhibitory concentration (MIC) frequency distributions. The use of MIC values removed the subjectivity associated with the choice of breakpoints which define an isolate as susceptible or resistant. A discrete-time survival analysis model was introduced as the recommended statistical model when MICs are the outcome. A questionnaire was completed by each farm manager on biosecurity, preventive medication, vaccines, disease history, and production management.
Multivariable population-averaged statistical models were used to determine the relationships among antimicrobial susceptibility patterns and potential herd-level risk factors. Controlling for herd type (antimicrobial-free versus conventional), each antimicrobial-bacterial species combination yielded unique combinations of risk factors; however, housing type, history of rhinitis, farm ventilation, and history of swine flu were significant in more than one model. A variety of herd-level practices were associated with the prevalence of antimicrobial resistance on swine farms. Further studies are encouraged when considering interventions for antimicrobial resistance on both antimicrobial-free and conventional farms.
APA, Harvard, Vancouver, ISO, and other styles
34

Groenewald, Jakobus William. "Collective bargaining, minimum labour standards and regulated flexibility in the South African clothing manufacturing sector: at the level of the National Clothing Bargaining Council's Western Cape Sub-Chamber." Thesis, 2006. http://etd.uwc.ac.za/index.php?module=etd&action=viewtitle&id=gen8Srv25Nme4_5115_1228892816.

Full text
Abstract:
In the context of a society in which there is an urgent need to create jobs, this research considers, firstly, whether the current labour regulatory environment is flexible enough to allow for an employment scenario that is conducive to job creation. The research then considers what is meant by the policy of 'regulated flexibility' and considers how flexibility operates in practice at NBC level. It is argued that the concept of flexibility is a misnomer, since it creates more problems than it solves. The research concludes with a call for real flexibility that will allow for increased investment and a greater supply of jobs.
APA, Harvard, Vancouver, ISO, and other styles
35

Lapa, Vânia Filipa Nunes. "Modelos de programação inteira em otimização financeira: construção de um índice de fundos." Master's thesis, 2017. http://hdl.handle.net/10316/84756.

Full text
Abstract:
Master's dissertation in Quantitative Methods in Finance presented to the Faculty of Sciences and Technology. This dissertation focuses on problems related to portfolio selection, namely index fund construction and portfolio optimization with a minimum transaction level.
In a general problem of portfolio optimization, an investor, who intends to invest in the stock market, must follow the asset price evolution to get better results, performing a complex and continuous analysis. To avoid this situation, he could invest in an index fund, which represents a portfolio that must represent the underlying stock index as closely as possible, in order to get similar returns. For this purpose, an integer programming model is constructed and this problem can be solved with the Branch-and-Bound method. In addition, one of the most important problems related to portfolio selection presented in the relevant literature was introduced by Markowitz, whose aim is to minimize the portfolio risk for a given level of expected minimum return. However, the solutions obtained through this optimization problem may represent portfolios that are not feasible on a practical level, since these portfolios might include reduced investments in certain assets and, consequently, the returns obtained may not exceed the associated costs, like transaction and maintenance costs. As a result, a model is chosen which, despite not being an integer programming problem, can be resolved with the same strategy in order to obtain solutions that satisfy the minimum transaction level for each asset if its position on the portfolio is positive.
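The minimum-transaction-level variant described above can be sketched on a toy instance. The semi-continuous constraint (each weight is either zero or at least a minimum level) is handled here by exhaustive grid search over the weight simplex rather than by Branch-and-Bound, which is only practical for very small examples; all numbers (expected returns, covariances, thresholds) are illustrative assumptions, not data from the dissertation.

```python
# Minimum-variance portfolio with a minimum transaction level:
# each asset weight is either 0 or at least LEVEL_U/N (semi-continuous).
# Toy 3-asset instance solved by grid search; the dissertation solves
# the real model with a Branch-and-Bound-based technique.

MU = [0.10, 0.07, 0.05]                      # expected returns (assumed)
COV = [[0.090, 0.010, 0.000],                # covariance matrix (assumed)
       [0.010, 0.040, 0.005],
       [0.000, 0.005, 0.010]]
R_MIN = 0.07     # required minimum expected return
N = 100          # grid resolution: weights in multiples of 1/N
LEVEL_U = 10     # minimum transaction level = LEVEL_U / N = 0.10

def variance(w):
    """Portfolio variance w' * COV * w."""
    return sum(w[i] * COV[i][j] * w[j]
               for i in range(3) for j in range(3))

best_w, best_var = None, float("inf")
for a in range(N + 1):
    for b in range(N + 1 - a):
        units = (a, b, N - a - b)
        # semi-continuous constraint: each weight is 0 or >= LEVEL_U/N
        if any(0 < u < LEVEL_U for u in units):
            continue
        w = [u / N for u in units]
        # minimum expected-return constraint
        if sum(m * wi for m, wi in zip(MU, w)) < R_MIN - 1e-12:
            continue
        v = variance(w)
        if v < best_var:
            best_w, best_var = w, v

print("weights:", best_w, "variance:", round(best_var, 5))
```

Every held asset in the resulting portfolio meets the 10% minimum position, illustrating why the constraint rules out the tiny allocations that an unconstrained Markowitz solution may produce.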
APA, Harvard, Vancouver, ISO, and other styles
36

ŽÁČKOVÁ, Klára. "Sezónní dynamika vybraných krevních parametrů u vybraných masných plemen ovcí chovaných v podhorských podmínkách." Master's thesis, 2009. http://www.nusl.cz/ntk/nusl-51260.

Full text
Abstract:
Sheep breeding is once again a developing branch of agriculture. There are many different breeds, and they react differently to the same environmental conditions. Sheep of the Charollais, Suffolk, Šumavská ovce, and Valaška breeds, kept under similar conditions, were observed in the spring and autumn of 2007 and 2008. In these seasons, blood samples were taken from ±7–24 ewes and lambs and analysed in a haematology laboratory. The haemoglobin level, haematocrit, erythrocyte and leucocyte counts, glucose, cholesterol, and triglyceride levels, urea and plasma proteins, the activity of the ALP and GMT enzymes, and phosphorus, calcium, magnesium, zinc, and copper levels were determined. The main objective of this project was to determine seasonal changes in the observed parameters. Further objectives were to determine differences in blood parameters between breeds and between breeding purposes. It was found that the observed breeds do not all react in the same way to similar conditions. Seasonal changes in the observed parameters were not conclusively demonstrated, but the average Hb level was higher in autumn than in spring, whereas the urea level was higher in spring than in autumn. It is demonstrable that similar conditions induce different responses not only in different breeds but also in breeds kept for different purposes.
APA, Harvard, Vancouver, ISO, and other styles
37

AMBROSI, AGNESE. "Le politiche pubbliche di lotta alla povertà: processi attuativi ed impatti dei nuovi schemi di reddito minimo in Emilia-Romagna. Sostegno per l’inclusione attiva (SIA), Reddito di inclusione (REI) e Reddito di solidarietà (RES)." Doctoral thesis, 2019. http://hdl.handle.net/11573/1231443.

Full text
Abstract:
The thesis analyses the implementation processes and impacts of the new anti-poverty policies in Emilia-Romagna (Support for Active Inclusion - Sia; Inclusion Income - Rei; Solidarity Income - Res). The research was conducted through documentary analysis; data analysis; field observation and social work; unstructured interviews; sixty case studies, each followed for a period of no less than six months; a 70-question questionnaire sent to 204 social workers across the region; and a 41-question questionnaire sent to all social districts of Emilia-Romagna. At the theoretical level, it draws on studies of public policy implementation and street-level bureaucracy.
APA, Harvard, Vancouver, ISO, and other styles
38

Земелько, Аліна Федорівна. "Теоретико-методичні підходи до обліку оплати праці та аналіз використання трудових ресурсів на ТОВ «Полтавка-1»". Магістерська робота, 2020. https://dspace.znu.edu.ua/jspui/handle/12345/2420.

Full text
Abstract:
Zemelko A. F. Theoretical and methodological approaches to payroll accounting and the analysis of the use of labour resources at LLC «Poltavka-1»: master's qualification work, speciality 071 "Accounting and Taxation" / supervisor V. V. Somchenko. Zaporizhzhia: ZNU, 2020. 117 p.<br>Master's thesis: 117 pp., 12 figures, 21 tables, 68 references. The purpose of this work is to study the features of accounting for, and to carry out a comprehensive analysis of, business transactions related to payroll accounting and the efficiency of the use of labour resources at LLC «Poltavka-1». The object of the study is the process of payroll accounting and the analysis of the efficiency of labour resource use at LLC «Poltavka-1». The following research methods were used in the course of the work: comparative legal; sociological; system-structural; statistical; and elements of the accounting method (accounts, double entry, documentation, balance sheet generalization, and reporting).
The scientific novelty and practical significance of the results lie in the theoretical substantiation and development of organizational, methodological, and practical recommendations for improving the accounting and analysis of payroll settlements, the use of which will improve the accounting and analytical support of the agro-industrial sector, namely: – the definition of «wages» is refined: wages should be understood as the wage rate fixed by the employment contract for a hired worker, the level of which is determined by labour market conditions (the cost of labour, terms of hiring, demand, supply, and competition), by the level of qualification and specialization of the employee, and by incentive systems and the success of the work itself; unlike existing interpretations, this definition is based on the principle of systematicity and combines different approaches; – it is proposed to introduce at the enterprise a document flow schedule for payroll accounting, which will help improve all work at the enterprise and strengthen the control functions of accounting; – it is proposed to clarify, add, and delete certain lines and columns in form 67 B «Tractor-machinist's record sheet», which will make it possible to: obtain information about the structural unit that issued the document; keep analytical cost records not only by structural divisions but also by responsibility centres, as well as by analytical sub-accounts; keep records of labour costs and their payment not only by crop type but also by individual varieties; and exercise operational control over deviations from output norms and identify the specific causes of those deviations.
APA, Harvard, Vancouver, ISO, and other styles
39

Pitacas, João. "Modelo Operacional dos Corpos de Bombeiros à Escala Intermunicipal." Master's thesis, 2021. http://hdl.handle.net/10400.26/35505.

Full text
Abstract:
Fire brigades (Corpos de Bombeiros, CB) operate a network of stations deployed throughout the national territory, currently organized by a model based on the territorial limits of the NUTS (Nomenclature of Territorial Units for Statistical Purposes). It is therefore possible to make better use of the CB network at the sub-regional level through the implementation of CB performance criteria, including the population covered within the agreed response times. The objective of this work is to propose an operational reorganization of the station network already in place in the Lezíria do Tejo and Médio Tejo sub-regions, based on a model of the Main Network of Operational Services of the Fire Brigades in Mainland Portugal. To this end, criteria were defined for the constitution of CB Groups which, by sharing areas of activity among themselves, make it possible to prioritize the dispatch of resources based on response time within the limits of each Group. The application of these criteria, taking into account the existing road network and the distribution of the resident population, made it possible, using the QGIS® software, to identify the zones where the station network needs reinforcement. Applying the model to the study area, there was an increase of 18.4% (1,401 km²) in the area and of 6.1% in the population (30,524 inhabitants) covered within the reference times (10 and 20 minutes). To guarantee operational activity in the 24 municipalities covered, the CB network would consist of 24 headquarters stations and 22 outposts (reinforcing the current network with 8), staffed by a minimum total of 1,897 professional firefighters.
The fact that the station network is already in place in the study territory and only needs occasional reinforcement should now be a factor triggering interest, on the part of the various entities involved, in making better use of it.
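The coverage computation at the core of this analysis — which share of the resident population lies within 10 and 20 minutes of the nearest station — can be sketched without GIS software as a multi-source shortest-path problem on the road network. The graph, travel times, population figures, and station placement below are invented for illustration; the dissertation performs the real analysis with QGIS® on the actual road network.

```python
import heapq

# Toy road network: travel times in minutes between nodes (undirected).
# All data below are assumptions for illustration only.
EDGES = [("A", "B", 5), ("B", "C", 8), ("C", "D", 15), ("A", "E", 12)]
POPULATION = {"A": 100, "B": 200, "C": 300, "D": 400, "E": 150}
STATIONS = ["A"]          # nodes hosting a fire station (assumed)

graph = {n: [] for n in POPULATION}
for u, v, t in EDGES:
    graph[u].append((v, t))
    graph[v].append((u, t))

def response_times(stations):
    """Multi-source Dijkstra: minutes from the nearest station to each node."""
    dist = {n: float("inf") for n in graph}
    heap = [(0, s) for s in stations]
    for s in stations:
        dist[s] = 0
    heapq.heapify(heap)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                      # stale queue entry
        for v, t in graph[u]:
            if d + t < dist[v]:
                dist[v] = d + t
                heapq.heappush(heap, (d + t, v))
    return dist

def covered_population(dist, limit):
    """Total population of nodes reachable within `limit` minutes."""
    return sum(POPULATION[n] for n in dist if dist[n] <= limit)

dist = response_times(STATIONS)
print("within 10 min:", covered_population(dist, 10))   # A and B -> 300
print("within 20 min:", covered_population(dist, 20))   # A, B, C, E -> 750
```

Re-running `covered_population` after adding a candidate node to `STATIONS` shows the marginal population gained by a new outpost, which is essentially the comparison the study makes when proposing reinforcements to the network.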
APA, Harvard, Vancouver, ISO, and other styles