Dissertations / Theses on the topic 'The evaluated methods'

Consult the top 50 dissertations / theses for your research on the topic 'The evaluated methods.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses from a wide variety of disciplines and organise your bibliography correctly.

1

Karlsson, Rasmus, and Alvar Sveninge. "Virtual Reality Locomotion : Four Evaluated Locomotion Methods." Thesis, Högskolan Väst, Avd för informatik, 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:hv:diva-11651.

Abstract:
Virtual Reality consumer hardware is now available for the masses through the HTC Vive, Oculus Rift and PlayStation VR. Locomotion, or virtual travel inside immersive experiences, is an area which is yet to be fully solved due to space constraints, problems with retaining immersion and potential sickness. The goal of this thesis was to evaluate user preferences for four locomotion methods in Virtual Reality with a first-generation HTC Vive through the gaming platform Steam. The theoretical framework provides an elementary understanding of the field of Virtual Reality and of how humans interact with and are affected by locomotion in that context. To contextualize the experience of evaluating the locomotion systems, the Hedonic-Motivation System Adoption Model is used, as it covers intrinsic motivation, which is common in video games, social networking and virtual worlds. An extensive game-selection process was performed, resulting in four locomotion methods with four games per method. Sixteen participants each tested one locomotion method, and their gameplay was recorded for later observation. After each game session the participants answered surveys, and after completing all games a questionnaire gauged their sickness level. The results proved inconclusive. While the uninterpreted results showed the locomotion method Artificial as the overall winner, a range of potential problems were found with the study in general. These included observations which did not match the expected results, introducing doubt about either how the study was conducted or the reliability of certain users. A larger sample size along with a better study procedure could possibly have provided a more conclusive answer.
2

Brinkman, Jacoline Willijanne. "Albuminuria as a laboratory risk marker: methods evaluated." [S.l. : Groningen : s.n. ; University Library Groningen] [Host], 2007. http://irs.ub.rug.nl/ppn/304605956.

3

Richman, Judah Lee. "Accelerated methods of residential construction : prefabrication re-evaluated." Thesis, Massachusetts Institute of Technology, 1994. http://hdl.handle.net/1721.1/12310.

4

Rinne, Teemu. "Human cortical functions in auditory change detection evaluated with multiple brain research methods." Helsinki : University of Helsinki, 2001. http://ethesis.helsinki.fi/julkaisut/hum/psyko/vk/rinne/.

5

Yue, Warren. "Peak power-handling capacity of finline structures evaluated by numerical and experimental methods." Thesis, University of Ottawa (Canada), 1986. http://hdl.handle.net/10393/5548.

6

Bati, Nabil A. "Thermal, oxidative and hydrolytic stability of selected frying shortenings evaluated by new and conventional methods." Diss., Virginia Polytechnic Institute and State University, 1989. http://hdl.handle.net/10919/54488.

Abstract:
The thermal, oxidative, and hydrolytic stability of several frying shortenings were studied via chemical, physical and sensory analyses. Corn, cottonseed and peanut oils, and cottonseed and soybean liquid shortenings were tested under static heating conditions, while peanut oil and cottonseed and soybean oil liquid shortenings were evaluated under commercial frying conditions. The research had two objectives: to evaluate the relative stability of the various shortenings under both heating conditions, and to evaluate new or modified quality assessment methods which would provide early prediction of heat abuse for the fast-food industry. Six of the conducted analyses were conventional or modified: free fatty acids; polar components; gas chromatographic volatile profiles; viscosity; FoodOil-Sensor; and sensory. Three were new: contact angle; high-temperature gas chromatographic analysis of triglycerides; and polar component % as determined by high-performance thin-layer chromatography (HPTLC). Under static heating conditions, varying heating periods or shortening types had significant (P<0.0001) effects on the resulting data of the following tests: free fatty acids; polar components; total volatiles; dielectric constant; viscosity; polar component % measured by HPTLC; contact angle; and sensory analysis; but heating time had no significant effect on triglyceride profiles. Under commercial frying conditions of chicken nuggets and filets, heating time had significant effects on changes in the dielectric constant, free fatty acid %, viscosity, contact angle, and sensory rating; it also had a significant effect on the polar component % under chicken nugget frying conditions only. Furthermore, heating time had no significant effect on polar component % under chicken filet frying conditions, or on polar component % by HPTLC under both frying conditions. Cottonseed oil liquid shortening had sensory scores equal to peanut oil under static and commercial frying conditions even though peanut oil exhibited greater chemical and physical stability. Soybean oil liquid shortening had an objective quality identical to peanut oil; however, its subjective quality was lower. Cottonseed oil liquid shortening had better flavor but less objective stability than soybean oil liquid shortening. The cut-off quality level for the shortenings was not reached, because all the shortenings were discarded after seven days of use, which was before the onset of significant quality deterioration. The best on-site index of shortening stability was the FoodOil-Sensor reading (dielectric constant), followed by the free fatty acid test.
7

Mosoke, Eko victor. "Four Wastewater Treatment Methods Evaluated from a Sustainability Perspective in the Limbe Urban Municipality Cameroon (Central Africa)." Thesis, Mittuniversitetet, Institutionen för teknik och hållbar utveckling, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-18434.

Abstract:
Aggravated by rapid population growth, urbanization and industrialization, and most recently by climate change events, the availability of water, especially in the third world, is reaching critical proportions. This is worsened by the non-treatment of wastewater (sewage) and the discharge of untreated wastewater into water bodies. The study focused on identifying and reviewing, from a sustainability perspective, four wastewater treatment methods (waste stabilization pond, constructed wetland, up-flow anaerobic sludge blanket reactor and sedimentation/thickening tank systems) suitable for the Limbe Urban Municipality (LUM) of Cameroon in Central West Africa, which has an estimated population of 120,000 inhabitants and a 4.7 per cent annual growth rate. The attractiveness of these four methods stems from their apparent energy efficiency, simplicity, robustness and low cost in situations where, as in the LUM, there are huge tracts of available land and warm temperatures, and from their capacity to promote effluent re-use opportunities for various sectors. Issues of sustainability of the water supply and wastewater treatment systems, untreated sewage, and their contribution to escalating environmental and public health impacts in LUM (Cameroon) were critically evaluated and discussed with the aid of Kärrman's (2000) framework approach, which employs different sets of sustainability criteria (Environmental, Health and Hygiene, and Functional), sub-criteria and indicators. The results reveal that the water and wastewater treatment systems in LUM do not operate in or conform to sustainability perspectives. Approximately 45 per cent of inhabitants still lack access to clean drinking water, especially in the dry periods of the year; sanitation coverage is low, with the tradition of sewage treatment in septic tanks and pit latrines; and public health impacts associated with water-borne infections (cholera, dysentery, malaria, typhoid fever and diarrhea) are rising yearly, with 6 deaths reported in LUM. These problems are directly or indirectly linked to the consumption of contaminated water or food in communities such as Mile II, Isokolo, Bonadikombo (Mile Four) and New Town, and in flood-prone zones of the Limbe urban municipality.
8

Fotherby, Martin D. "Non-pharmacological methods of blood pressure reduction in elderly hypertensives evaluated by 24-hour ambulatory BP monitoring." Thesis, University of Leicester, 1995. http://hdl.handle.net/2381/34347.

Abstract:
This thesis examines the effects of non-pharmacological methods on lowering blood pressure, and the potential mechanisms for their action, in elderly hypertensive and normotensive persons. Changes in blood pressure following these interventions were evaluated by conventional clinic blood pressure measurements and 24-hour ambulatory blood pressure monitoring. The reproducibility of individual 24-hour ambulatory blood pressure measurements in elderly subjects was shown to be greater than that of clinic measurements, enabling smaller blood pressure changes to be detected in a given number of subjects; other advantages are the ability to assess blood pressure changes over the 24-hour period and the lack of observer bias and placebo effect. Moderate restriction of dietary sodium intake (from 174 to 95 mmol/24 hour) resulted in a fall in clinic systolic blood pressure only, while a moderate increase in potassium intake using diet supplements produced falls in clinic systolic and diastolic blood pressure and also in 24-hour ambulatory systolic blood pressure. Sustained caffeine use was found to have no significant effect on clinic or ambulatory blood pressure levels. The substitution of non-pharmacological methods, including reduction of weight and sodium intake and increases in dietary potassium intake, following withdrawal of anti-hypertensive drug therapy in elderly hypertensive patients with controlled blood pressure allowed 20% of such patients to remain normotensive off medication for over 1 year. The main limitation on the replacement of anti-hypertensive drugs with non-pharmacological therapies was the high prevalence of poorly controlled blood pressure levels in currently treated elderly hypertensives. The routine use of non-pharmacological methods by general practitioners to lower high blood pressure in elderly hypertensive patients was found to be limited; only a minority use such methods as first-line treatment. In conclusion, significant reductions in blood pressure with certain non-pharmacological methods have been observed in some elderly hypertensive persons. However, it appears that non-pharmacological therapy will need to be combined with drug therapy to achieve satisfactory blood pressure control in many elderly hypertensive subjects.
9

Brockbank, Sarah Ann. "Aqueous Henry's Law Constants, Infinite Dilution Activity Coefficients, and Water Solubility: Critically Evaluated Database, Experimental Analysis, and Prediction Methods." BYU ScholarsArchive, 2013. https://scholarsarchive.byu.edu/etd/3691.

Abstract:
A database containing Henry's law constants, infinite dilution activity coefficients and solubility data of industrially important chemicals in aqueous systems has been compiled. These properties are important in predicting the fate and transport of chemicals in the environment. The structure of this database is compatible with the existing DIPPR® 801 database and DIADEM interface, and the compounds included are a subset of the compounds found in the DIPPR® 801 database. Thermodynamic relationships, chemical family trends, and predicted values were carefully considered when designating recommended values. Henry's law constants and infinite dilution activity coefficients were measured for toluene, 1-butanol, anisole, 1,2-difluorobenzene, 4-bromotoluene, 1,2,3-trichlorobenzene, and 2,4-dichlorotoluene in water using the inert gas stripping method at ambient pressure (approximately 12.5 psia) and at temperatures between 8°C and 50°C. Fugacity ratios, required to determine infinite dilution activity coefficients for the solid solutes, were calculated from literature values for the heat of fusion and the liquid and solid heat capacities. Chemicals were chosen based on missing or conflicting data in the literature. A first-order temperature-dependent group contribution method was developed to predict Henry's law constants of hydrocarbons, alcohols, ketones, and formates where none of the functional groups are attached directly to a benzene ring. Efforts to expand this method to include ester and ether groups were unsuccessful. Second-order groups were developed at a reference condition of 298.15 K and 100 kPa. A second-order temperature-dependent group contribution method was then developed for hydrocarbons, ketones, esters, ethers, and alcohols. These methods were compared to existing literature prediction methods.
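For orientation, the quantities in such a database are linked by standard dilute-solution thermodynamics. In textbook form (these are the conventional relations, not necessarily the thesis's exact working equations), the Henry's law constant follows from the infinite dilution activity coefficient and the pure-liquid fugacity, which at low pressure is close to the vapour pressure, while solid solutes require the fugacity-ratio correction computed from the heat of fusion and heat capacities:

    H_i \approx \gamma_i^{\infty} \, p_i^{\mathrm{sat}}
    \qquad \text{(liquid solute, low pressure)}

    \ln\frac{f^{L}}{f^{S}}
      = \frac{\Delta H_{\mathrm{fus}}}{R\,T_m}\left(\frac{T_m}{T}-1\right)
      - \frac{\Delta c_p}{R}\left(\frac{T_m}{T}-1\right)
      + \frac{\Delta c_p}{R}\,\ln\frac{T_m}{T}

where T_m is the melting temperature and \Delta c_p = c_p^L - c_p^S. Relations of this kind are what make it possible to cross-check Henry's law constants, activity coefficients and solubilities against one another when designating recommended values.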
10

Størdal, Ingvild Fladvad. "Induction of CYP 1A enzyme activity and genotoxicity from ternary mixtures of produced water relvant compounds, evaluated by in vitro methods." Thesis, Norges teknisk-naturvitenskapelige universitet, Institutt for biologi, 2011. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-12757.

Abstract:
Produced water is a complex mixture discharged to sea in high volumes, containing compounds at low concentrations. Compounds in mixtures can modify each other's expected toxic effect predicted from single exposure, so obtaining information about potential interactions is important. Carbazole is present in produced water and is suggested to contribute to produced water's potential for modifying cytochrome P450 (CYP) activity. Information on the toxic effect of carbazole in relation to produced water is limited. Carbazole is included in this project to study its potential for modifying CYP 1A activity singly and in mixtures. The aromatic and phenolic fractions of produced water are significant contributors to the toxicity and concentration of organic compounds in produced water. As representative compounds of these fractions, benzo(a)pyrene (BaP) and 2,5-dimethylphenol (DMP) are included in this master's project. Biotransformation of harmful compounds is often initiated by CYPs catalysing oxidation reactions. Modified CYP 1A protein or activity is an indication of interaction between a compound and the biological system. Biotransformation catalysed by CYP 1A can produce reactive oxygen species (ROS) and reactive metabolites. An increased contaminant load can deplete reduced glutathione (GSH) through an increase in conjugation reactions. In addition to being an important conjugate, GSH is also an important antioxidant. With the potential both to increase the extent of DNA damage, by increasing levels of ROS and producing reactive metabolites, and to interfere with the glutathione-dependent defence protecting against oxidative stress, compounds interacting with CYP 1A and conjugation enzymes are suggested to contribute to DNA damage. The aims of this master's project were to determine carbazole's potential for modifying CYP 1A activity measured as ethoxyresorufin-O-deethylase (EROD) activity, to evaluate interaction in ternary mixtures of carbazole and two compounds representing fractions contributing to the toxicity of produced water, and to study the correlation between biotransformation activity and genotoxicity by measuring DNA double strand breaks (DSB). The aims were achieved by studying single compounds and ternary mixtures in vitro in the continuous cell line PLHC-1. Concentrations of the three compounds in the ternary mixtures were varied using a statistical design. Results were analysed using partial least squares projection to latent structures (PLS). Concentrations of the compounds included in the design were determined from cytotoxicity results, EROD concentration-effect curves for single compounds, and concentrations measured in the marine environment. It was hypothesized that compounds would modify EROD activity in PLHC-1 differently when present in ternary mixtures compared to single exposure. It was further suggested that ternary mixtures inducing high EROD activity would also enhance formation of DNA DSB in PLHC-1 cells. Carbazole was suggested to modify EROD activity induced by the other compounds. Exposure of PLHC-1 to BaP singly induced EROD activity significantly. Carbazole induced EROD activity slightly, and significantly so for the highest concentration in one replicate. An overall non-significant decrease in EROD activity was seen in PLHC-1 exposed to DMP. Exposure of PLHC-1 to ternary mixtures resulted in significant, positive PLS regression coefficients for both BaP and carbazole. The crossed term carbazole×DMP decreased EROD activity significantly. The squared terms for all three compounds were significant, equal and negative. The results indicate that carbazole's potential for inducing CYP 1A differs when it is alone compared to when it is present in a mixture. Carbazole is suggested to contribute to the EROD-inducing potency of the ternary mixtures. The effect of carbazole on EROD activity appears to depend on the co-exposed compounds. How EROD activity is modulated when exposed to DMP is suggested to depend on exposure conditions. The significant, substantial squared terms indicate that all three compounds interact with catalytic EROD activity at higher concentrations. Catalytic EROD activity is presumed to be a good indication of potential interaction between compounds and biological systems. Determining the extent of DNA damage by electrophoretic separation of DNA did not give consistent results. Electrophoretic separation of DNA from PLHC-1 is assumed to be more pertinent for determining the genotoxicity of certain metals in PLHC-1. Statistical design and projection techniques are considered valuable tools when assessing the toxicity of mixtures.
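The mixture analysis described above can be illustrated with a small sketch: a quadratic response surface (linear, crossed and squared terms in the three concentrations) fitted by PLS regression. The data below are synthetic stand-ins generated to mimic the reported pattern of coefficients; the variable names and effect sizes are assumptions for illustration, not the thesis's data.

    # Illustrative only: synthetic data mimicking the reported coefficient
    # pattern (positive BaP and carbazole terms, negative carbazole*DMP
    # cross term, negative squared terms).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n = 30
    bap, carb, dmp = rng.uniform(0.0, 1.0, (3, n))   # scaled concentrations

    names = ["BaP", "Carb", "DMP", "Carb*DMP", "BaP^2", "Carb^2", "DMP^2"]
    X = np.column_stack([bap, carb, dmp, carb * dmp, bap**2, carb**2, dmp**2])
    erod = (1.2 * bap + 0.6 * carb - 0.5 * carb * dmp
            - 0.4 * (bap**2 + carb**2 + dmp**2)
            + rng.normal(0.0, 0.05, n))              # synthetic EROD response

    pls = PLSRegression(n_components=3).fit(X, erod)
    for name, coef in zip(names, pls.coef_.ravel()):
        print(f"{name:>8s}: {coef:+.3f}")

Fitting the full quadratic model is what lets the sign of each term be read off directly, as in the abstract's discussion of the crossed and squared terms.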
11

Dwyer, Thomas Patrick. "An investigation into improving the repeatability of steady-state measurements from nonlinear systems : methods for measuring repeatable data from steady-state engine tests were evaluated : a comprehensive and novel approach to acquiring high quality steady-state emissions data was developed." Thesis, University of Bradford, 2014. http://hdl.handle.net/10454/13824.

Abstract:
The calibration of modern internal combustion engines requires ever improving measurement data quality so that they comply with increasingly stringent emissions legislation. This study establishes a methodology and a software tool to improve the quality of steady-state emissions measurements from engine dynamometer tests. The literature shows that state-of-the-art instrumentation is necessary to monitor the cycle-by-cycle variations that significantly alter emissions measurements. Test methodologies that consider emissions formation mechanisms invariably focus on thermal transients and preconditioning of internal surfaces. This work sought data quality improvements using three principal approaches: an adapted steady-state identifier to more reliably indicate when the test conditions reached steady-state; engine preconditioning to reduce the influence of the prior day's operating conditions on the measurements; and test point ordering to reduce measurement deviation. An improved steady-state indicator was selected using correlations in test data. It was shown by repeating forty steady-state test points that a more robust steady-state indicator has the potential to reduce the measurement deviation of particulate number by 6%, unburned hydrocarbons by 24%, carbon monoxide by 10% and oxides of nitrogen by 29%. The variation of emissions measurements from those normally observed at a repeat baseline test point was significantly influenced by varying the preconditioning power. Preconditioning at the baseline operating condition converged emissions measurements with the mean of those typically observed. Changing the sequence of steady-state test points caused significant differences in the measured engine performance. Examining the causes of measurement deviation allowed an optimised test point sequencing method to be developed. A 30% reduction in measurement deviation of a targeted engine response (particulate number emissions) was obtained using the developed test methodology. This was achieved by selecting an appropriate steady-state indicator and sequencing test points. The benefits of preconditioning were deemed short-lived and impractical to apply in everyday engine testing, although the principles were considered when developing the sequencing methodology.
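The thesis's adapted identifier is not spelled out in the abstract; as a minimal sketch of the general idea, a windowed indicator can flag steady-state once the recent coefficient of variation of a monitored signal falls below a tolerance. The window length and tolerance below are assumptions for illustration.

    import numpy as np

    def is_steady(signal, window=50, rel_tol=0.01):
        """Flag steady-state when the latest `window` samples have a
        coefficient of variation below `rel_tol`."""
        recent = np.asarray(signal[-window:], dtype=float)
        if recent.size < window:
            return False                      # not enough samples yet
        mean = recent.mean()
        if mean == 0.0:
            return recent.std() < rel_tol     # avoid dividing by zero
        return recent.std() / abs(mean) < rel_tol

    # Example: a noisy signal settling toward a constant level.
    t = np.linspace(0.0, 10.0, 500)
    sig = 5.0 * (1.0 - np.exp(-t)) + np.random.default_rng(1).normal(0.0, 0.01, t.size)
    print(is_steady(sig))                     # True once the transient decays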
12

Laitala, Christer. "Evaluate methods for managing distributed source changes." Thesis, Blekinge Tekniska Högskola, Avdelningen för för interaktion och systemdesign, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4624.

Abstract:
In larger developments, the use of configuration management is crucial; the company UIQ Technology is no exception. The configuration management method controls how work flows through a software organisation, so the configuration management method and code complexity affect each other. It may therefore be possible to combine multiple configuration methods, taking the best from each, to decrease code complexity. That is the goal of this thesis: to see whether the COTS, Single repository or Component Based methods could be combined with the UIQ method to decrease code complexity. This was tested through theoretical use cases for each method, and the conclusion of the study is that Single repository and Component Based work best with the UIQ method. COTS, however, is not suited to the UIQ method because large parts of the UIQ platform must remain secret. UIQ wants to do as much as possible in-house rather than hand work out to third-party companies it does not absolutely need. One improvement was achieved through Single repository: the third-party companies need to be up to date before starting development, something that had not been valued before.
13

Křížovská, Eliška. "Stanovení ceny stavebního podniku střední velikosti." Master's thesis, Vysoké učení technické v Brně. Ústav soudního inženýrství, 2011. http://www.nusl.cz/ntk/nusl-232547.

Abstract:
The subject of this diploma thesis is the price assessment of a middle-sized construction company. It deals with an actual firm and factual data, which are evaluated by means of two methods. These methods were chosen with regard to the situation on the market, and the calculation is carried out in the practical part of the thesis.
14

Rahman, Mashuqur, Ulf Håkansson, and Johan Wiklund. "Grout pump characteristics evaluated with the UVP+PD method." KTH, Jord- och bergmekanik, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-107161.

Abstract:
Rock grouting is performed to decrease the hydraulic conductivity around underground structures, such as tunnels and caverns. Cement grouts are often used and are pumped into joints and fractures of the rock formation. Piston-type pumps are mostly used for high-pressure rock grouting. A pulsation effect is inevitable when using this type of pump due to the movement of the piston. The effect of this pulsation on rock grouting is not yet known, but it is believed to be beneficial for the penetration of the grout. Current flow meters used in the field are not accurate enough to determine the fluctuation of the flow rate when it is less than 1 l/min. In addition, currently available flow meters measure the average of the flow over a certain period of time, hence the true fluctuation of the flow rate due to the pulsation of the piston remains unknown. In this paper, a new methodology, the so-called 'Ultrasound Velocity Profiling – Pressure Difference' (UVP+PD) method, is introduced to show the pulsation effect when using a piston-type pump. The feasibility of this method was successfully investigated for the direct in-line determination of the rheological properties of micro-cement-based grouts under field conditions (Rahman & Håkansson, 2011). Subsequently, it was also found that this method can be very efficient for measuring the fluctuation of the flow rate for different types of pumps. From a grouting point of view, the UVP+PD method can be used to synchronize the pressure and flow of a piston-type pump by measuring the pulsation effect. Consequently, it can be used as a tool for the efficiency and quality control of different types of pumps.
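For context, the UVP+PD principle rests on the textbook relations for steady pipe flow (standard fluid mechanics, not formulas quoted from this paper): the measured pressure difference \Delta p over a pipe length L fixes the shear-stress profile, the measured velocity profile v(r) gives the shear rate, and integrating the profile gives the instantaneous flow rate:

    \tau(r) = \frac{\Delta p \, r}{2L}, \qquad
    \dot{\gamma}(r) = -\frac{\mathrm{d}v}{\mathrm{d}r}, \qquad
    Q = 2\pi \int_0^R v(r)\, r\, \mathrm{d}r

Pairing \tau(r) with \dot{\gamma}(r) across the radius yields the flow curve of the grout in a single in-line measurement, and evaluating Q for each measured profile is what resolves the pulsation of a piston pump.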

15

Burrows, Timothy. "A Preliminary Rubric Design to Evaluate Mixed Methods Research." Diss., Virginia Tech, 2013. http://hdl.handle.net/10919/19324.

Abstract:
With the increase in frequency of the use of mixed methods, both in research publications and in externally funded grants there are increasing calls for a set of standards to assess the quality of mixed methods research. The purpose of this mixed methods study was to conduct a multi-phase analysis to create a preliminary rubric to evaluate mixed methods research articles. This study included four research questions:
1. What are the common evaluation criteria found in the contemporary methodological literature pertaining to the design of mixed methods research?
2. What evaluation criteria do experts in the field of mixed methods research perceive as the most important when distinguishing top-quality research in mixed methods?
3. What differences are there in the outcome of the rubric for evaluating mixed methods research identified from the literature compared to those advocated most uniformly by a panel of mixed methods research experts?
4. What are disciplinary differences between the use of mixed methods and views about evaluating it, including the role of paradigms in mixed methods research?
    In the first phase of this multi-phase mixed methods study I used an inductive qualitative process to identify the quality criteria endorsed by 12 methodologists with a long-term involvement in mixed methods research. In the second phase of this study I conducted a quantitative analysis to pilot test a set of criteria identified in the qualitative phase. The sample for both phases of this study comprised the same eight males and four females from multiple nationalities. Respondents to the on-line survey rated all 14 items as being important, with 11 of the 14 items rated as very important or higher.
    When considered together, findings from the two phases of this study provide an interesting view of attitudes about the use and application of quality standards to the mixed methods literature. While there was agreement about which elements were important to evaluate, there was no agreement that one set of standards could be applied to all mixed methods studies.

16

Arts, Daniëlle Geertruida Theodora. "Information for intensive care: evaluation methods to assess and improve data quality and data processing." [S.l. : Amsterdam : s.n.] ; Universiteit van Amsterdam [Host], 2005. http://dare.uva.nl/document/79006.

17

Pak, Brian H. "Methods to evaluate the emotional response to U.S. antiterrorism measures." [Gainesville, Fla.] : University of Florida, 2005. http://purl.fcla.edu/fcla/etd/UFE0013287.

18

Zarour, Mohammad. "Methods to evaluate lightweight software process assessment methods based on evaluation theory and engineering design principles." Mémoire, École de technologie supérieure, 2009. http://espace.etsmtl.ca/92/1/ZAROUR_Mohammad.pdf.

Abstract:
Achieving a mature software development process has become essential for many software development organizations. A mature software development process enables organizations to deliver high-quality software products to their customers, on time and on budget. Software development organizations have struggled for decades to improve the quality of their products by improving their software development processes. Designing a software process improvement program is demanding and complex. An improvement program comprises two main processes: the assessment process and the improvement process. The success of the improvement program first requires a successful assessment; failing to assess the organization's software development process can lead to unsatisfactory results. Software process assessment can be used either to determine the capability of another organization, for example a subcontractor, or to determine and understand the status of the organization's current process in order to undertake an improvement effort. The growing number of available assessment processes, the ISO 15504 standard, which defines the requirements for process assessment, and the popularity of the CMMI model illustrate the relevance of software process assessment to the software development industry. Today, several methods are available for assessing the maturity and capability of software development processes. These methods are based on well-known assessment and improvement frameworks such as CMMI and ISO 15504. The success of these assessment methods and improvement frameworks is supported by post-development studies of their validity, reliability and efficiency. Unfortunately, many researchers have found that such methods are too heavyweight to be implemented in very small enterprises (VSEs). Consequently, a few researchers have studied the assessment and improvement process in VSEs and have proposed assessment methods, generally called 'lightweight SPA methods', better suited to the needs of these organizations. Current research in the SPA field emphasizes proposals for easy-to-use assessment methods, but does not investigate the extent to which the design of these methods is aligned with an engineering design perspective. This imprecise alignment with the engineering discipline raises questions about the relevance and representativeness, from an engineering point of view, of the results obtained by these methods. Moreover, although many currently available SPA methods offer help and guidance, they unfortunately only partially address the elements considered essential to carrying out SPA successfully. This thesis presents and discusses the evaluation of SPA methods. The evaluation proposed here has two components: an evaluation of SPA methods using a top-down approach based on the engineering design viewpoint, and a bottom-up approach to evaluate the success of SPA methods. Concepts from evaluation theory are used as a reference framework to develop both evaluation methods formally.
To develop the first evaluation method, using the top-down approach, an exploratory analytical study of SPA methods from an engineering design point of view was carried out, with Vincenti's classification used as the analytical instrument. The objective of this exploratory study is to situate existing SPA methods within the reference framework of engineering design and to use this framework as a guideline for positioning new SPA methods, at the design stage, within the same engineering design framework. To develop the second evaluation method, using the bottom-up approach, a systematic literature review was performed to extract the set of elements necessary for the success of SPA methods, based on the requirements, observations, lessons learned and recommendations that have been experienced in industry and published in books, conferences and journals. The development process of the two evaluation methods was then verified using a set of verification criteria, and the proposed evaluation methods were tested through three case studies. The first evaluation method would be most useful to designers of new SPA methods during the design phase, while the second would be useful to both designers and practitioners of SPA methods for verifying the success of the assessment method in question. This research project constitutes an entry point for studying the alignment of SPA method design with engineering design principles. It also sheds light on the successful achievement of assessment results by studying the success elements that must be supported by assessment methods, considered separately from the improvement process. The evaluation methods proposed in this thesis are of particular benefit for SPA methods designed primarily for VSEs, because such methods, unlike the better-known ones, are not yet supported by comprehensive studies of their reliability and efficiency.
19

Siu, Sun Chau. "New methods to evaluate the effects of fouling on process chromatography." Thesis, University College London (University of London), 2005. http://discovery.ucl.ac.uk/1446675/.

Abstract:
This thesis examines new approaches to evaluate the effects of fouling on process chromatography. Fouling can have a serious, negative impact on the performance of chromatography, and considerable effort is normally spent to prevent fouling species reaching the column, or in developing clean-in-place (CIP) protocols of ever-increasing complexity to mitigate their effects. Despite this, the knowledge of chromatographic fouling often seems anecdotal, with only a few systematic investigations currently reported in the literature. Furthermore, conventional approaches to investigating chromatographic fouling provide only an overall indication of its state. New approaches to investigate fouling at increasingly fine detail are studied in this thesis and provide valuable insights into the mechanism of fouling. At the whole-column level, the method of frontal analysis was used to determine the effects of fouling a packed-bed column (DEAE Sepharose FF) with yeast homogenate. The shape and position of breakthrough curves generated by frontal analysis were used to quantitatively assess the impact of fouling on binding capacity and to qualitatively infer the overall changes in mass transfer properties. In particular, the effects of particulate solids and of different modes of applying the fouling stream to the column were examined. Breakthrough curve analysis was also used to investigate the effectiveness of a rigorous CIP procedure in restoring the characteristics of a fouled column. An extended reverse-flow technique using an acetone tracer was then developed to quantify the dispersive effects of fouling on defined axial sections within a packed-bed column, giving more than an overall indication of the fouling condition. The influence of column diameter, bed length and two different header designs on the extent of fouling was examined. The technique allows the band-broadening effects due to reversible macroscopic factors, such as flow maldistribution in the flow distributor and inside the packed bed caused by packing heterogeneity, to be separated from irreversible microscopic factors, such as intraparticle diffusion, external fluid film mass transfer and interparticle axial dispersion. It was shown to be a simple, non-destructive method for investigating chromatographic fouling at an intra-column level. Finally, confocal scanning laser microscopy (CSLM) was proven to be a powerful technique to directly visualise fouling at a single-bead level. A particularly aggressive fouling stream of partially clarified E. coli homogenate was used to challenge an anion exchange resin (Q Sepharose FF) in a finite bath, and subsequently in a packed structure under flow conditions. The fouling caused by the material was visualised by fluorescently labelling DNA and host cell proteins in the fouling stream and by measuring the binding capacity and uptake rate for a fluorescently labelled test protein, BSA. The use of CSLM also allowed the application of various CIP procedures to be visually followed. The competitive adsorption of whole cells or cell debris and DNA to Q Sepharose FF has also been visualised. The confocal images obtained provide insights into the spatial distribution of key foulant types within a single bead. This thesis concludes with recommendations for future work, which will seek to extend the analysis to situations where fouling occurs under flow, through the design of appropriate flow cells and methods of analysis.
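As background to the frontal-analysis measurements, the usual way a breakthrough curve is reduced to a capacity figure is the dynamic binding capacity at some breakthrough fraction, commonly 10%. This is the standard textbook definition rather than a formula quoted from the thesis:

    \mathrm{DBC}_{10\%} = \frac{C_0\,(V_{10\%} - V_0)}{V_c}

where C_0 is the feed concentration, V_{10\%} the volume loaded when the effluent concentration reaches 10% of C_0, V_0 the system dead volume, and V_c the packed-bed volume. Fouling that shifts breakthrough earlier therefore shows up directly as a lower DBC, while changes in the steepness of the curve indicate altered mass-transfer properties.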
20

Grego, Mayor Jaime. "Defining a method to evaluate Boards of Directors efectiveness." Doctoral thesis, Universitat Internacional de Catalunya, 2017. http://hdl.handle.net/10803/580598.

Abstract:
The purpose of this thesis is to present a new board evaluation method which takes into account both qualitative and quantitative criteria and which meets three key requirements, namely being systematic, specific and objective. Board theories, board evaluation methods and their current use and effectiveness are reviewed. Finally, a new method is proposed and supported.
21

Martinsson, Emma. "To evaluate fire properties of a facade : - a study on semi natural test methods." Thesis, Luleå tekniska universitet, Institutionen för samhällsbyggnad och naturresurser, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-68085.

Abstract:
Due to an increase in the number of large-scale façade fires around the world, interest in the fire hazards of façades has also increased. The hazards of such fires have long been acknowledged, and many different test methods have therefore been developed to enable evaluation of the fire properties of wall assemblies. The purpose of this study is to map differences and similarities between existing full-scale test methods that are currently used to evaluate façade systems based on their performance when exposed to fire. The study also includes a review of previous research into parameters and conditions that influence a façade fire. Some past incidents are used to enable comparison between the tests and a real fire scenario. The review of previous research focuses on the areas of fire spread, fire properties influencing the heat flux from a fire, and comparative studies evaluating differences between existing test methods. For the study of parameters in existing test methods, 21 test methods have been identified and included in the study. A detailed compilation of information on each of the included test methods can be found in the tables in appendix A. The mapping of the differences and similarities of the included methods has resulted in comparisons of wall and specimen specifications, ignition source parameters, measuring points and approval criteria. The increased interest in façade fire hazards has also led to some new methods being developed and old methods being revised. New methods and unfinished revisions have not been included in the comparison study but are mentioned. The conclusion of this study is that although the variation between the tests at the detailed methodology level is very high, the conditions used for approval and evaluation can all be linked to identified hazards of façade fires. However, there are some parameters that need to be re-examined. Previous research indicates that the fire load may be the parameter with the most influence on the fire scenario. This is one of the parameters that varies the most, and at the same time it is likely to contribute to different results between the test methods. Another parameter is the influence of wall openings representing windows to overlying compartments. This is a parameter that could have a significant effect on the outcome of a test.
22

Wright, Davene. "Examining Methods Used to Evaluate the Cost-Effectiveness of Childhood Obesity Interventions." Thesis, Harvard University, 2012. http://dissertations.umi.com/gsas.harvard:10266.

Abstract:
This dissertation examines methods used to evaluate the cost-effectiveness of childhood obesity interventions in order to help decision-makers prioritize among competing health programs using standardized outcomes. Chapter 1 generates inputs for use in cost-effectiveness analyses (CEAs) of childhood obesity interventions. In Chapter 1.1, I use data from the Medical Expenditure Panel Survey to predict expenditures associated with obesity in childhood and adolescence. I found that obese children and adolescents have significantly different expenditures than their normal weight counterparts. I conclude that exclusion of obesity-related medical expenditures can potentially undervalue the cost-effectiveness of interventions. In Chapter 1.2, I use data from the Study of Early Child Care and Youth Development to examine the longitudinal trajectory of child weight. I derived probabilities of transitioning between weight classes that can be used in a decision-analytic model to extrapolate the effectiveness of childhood obesity interventions beyond childhood. I found that deviating from CDC BMI reference categories can more accurately capture the risk of future obesity. In Chapter 2, I evaluate the cost-effectiveness of a primary care-based obesity prevention program, High Five for Kids. Over two years, High Five for Kids was low-cost, but only marginally effective in reducing BMI. I used a decision analytic simulation model to extrapolate trial outcomes over a 10-year horizon, and found that in the long-term, primary care based obesity prevention was likely to be cost-effective relative to usual care. I also found that key methodological considerations can meaningfully influence the cost-effectiveness of childhood obesity interventions. In Chapter 3, I develop an agent-based model to explore the dynamics of the potential spread of obesity within families. I found that the “contagion” of obesity could result in significant collateral weight loss in family members not targeted in an intervention. As a result, CEAs may underestimate the benefits of obesity interventions. Moreover, I found that unless interventions are targeted toward all obese children in a family, the contagion of obesity can hinder weight loss in intervention targets. This model can be leveraged as a tool to optimize family-based obesity intervention strategies and inform randomized controlled obesity prevention trials.
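For readers outside health economics, the summary measure underlying such cost-effectiveness comparisons is the incremental cost-effectiveness ratio (the standard definition, not a formula specific to this dissertation):

    \mathrm{ICER} = \frac{C_{\text{intervention}} - C_{\text{comparator}}}{E_{\text{intervention}} - E_{\text{comparator}}}

i.e., the extra cost per extra unit of health effect (for example, per unit of BMI reduction or per quality-adjusted life year) of an intervention relative to usual care.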
23

Bidesi, Anup Singh. "Comparison of texture classification methods to evaluate spongy bone texture in osteoporosis /." free to MU campus, to others for purchase, 2004. http://wwwlib.umi.com/cr/mo/fullcit?p1422912.

24

Tran, Huong Thi. "Framework to Evaluate Entropy Based Data Fusion Methods in Supply Chain Management." Thesis, University of North Texas, 2016. https://digital.library.unt.edu/ark:/67531/metadc955034/.

Abstract:
This dissertation explores data fusion methodology to deduce an overall inference from data gathered from multiple heterogeneous sources. Typically, if there existed a data source in which the data were reliable and unbiased, then data fusion would not be necessary. Data fusion methodology combines data from multiple diverse sources so that the desired information - such as the population mean - is improved despite redundancies, inaccuracies, biases, and inflated variability in the data. Examples of data fusion include estimating average demand from similar sources, and integrating fatality counts from different media sources after a catastrophe. The approach in this study combines "inputs" from distinct sources so that the information is "fused." Another way of describing this process is "data integration." Important assumptions are: 1. Several sources provide "inputs" for information used to estimate parameters of a probability distribution. 2. Since distributions for the data from the sources are heterogeneous, some sources are less reliable. 3. Distortions, bias, censorship, and systematic errors may be more prominent in data from certain sources. 4. The sample size of source data, the number of "inputs," may be very small. Examples of information from multiple sources are abundant: traffic information from sensors at intersections, multiple economic indicators from various sources, demand data for a product using similar retail stores as sources, polling data from various sources, and counts of fatalities from different media sources after a catastrophic event. This dissertation seeks to address a gap in the operations literature by addressing three research questions regarding entropy-based data fusion (EBDF) approaches to estimation. Three separate, but unifying, essays address the research questions for this dissertation. Essay 1 provides an overview of supporting literature for the research questions. A numerical analysis of airline maximum wait time data illustrates the underlying issues involved in EBDF methods. This essay addresses the research question: Why consider alternative entropy-based weighting methods? Essay 2 introduces 13 data fusion methods. A Monte Carlo simulation study examines the performance of these methods in estimating the mean parameter of a population with either a normal or lognormal distribution. This essay addresses the following research questions: 1. Can an alternative formulation for Shannon's entropy enhance the performance of Sheu (2010)'s data fusion approach? 2. Do symmetric and skewed distributions affect the 13 data fusion methods differently? 3. Do negative and positive biases affect the performance of the 13 methods differently? 4. Do entropy-based data fusion methods outperform non-entropy-based data fusion methods? 5. Which data fusion methods are recommended for symmetric and skewed data sets when no bias is present? What is the recommendation under conditions of few data sources? Essay 3 explores the use of the data fusion method estimates of the population mean in a newsvendor problem. A Monte Carlo simulation study investigates the accuracy of using the estimates provided in Essay 2 as the parameter estimate for the distribution of demand, which follows an exponential distribution. This essay addresses the following research questions: 1. Do data fusion methods with relatively strong performance in estimating the parameter mean also provide relatively strong performance in estimating the optimal demand under a given ratio of overage and underage costs? 2. Do any of the data fusion methods deteriorate or improve with the introduction of positive and negative bias? 3. Do the alternative entropy formulations to Shannon's entropy enhance the performance of the methods on a relative basis? 4. Is the relative rank-ordering performance of the data fusion methods different between Essay 2 and Essay 3? The contribution of this research is to introduce alternative EBDF methods and to establish a framework for using EBDF methods in supply chain decision making. A comparative Monte Carlo simulation analysis provides a basis to investigate the robustness of the proposed data fusion methods for estimation of population parameters in a newsvendor problem with known distribution but unknown parameter. A sensitivity analysis is conducted to determine the effect of multiple sources, sample size, and distributions.
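As a toy illustration of the entropy-weighting idea (an assumed minimal scheme, not Sheu (2010)'s formulation or any of the dissertation's 13 methods), source means can be fused with weights inversely related to each source's Shannon entropy, so that more diffuse, less informative sources contribute less:

    import numpy as np

    def shannon_entropy(samples, bins=10):
        """Entropy (nats) of a source's empirical histogram distribution."""
        counts, _ = np.histogram(samples, bins=bins)
        p = counts[counts > 0] / counts.sum()
        return float(-(p * np.log(p)).sum())

    def entropy_fused_mean(sources):
        """Fuse source means with normalized inverse-entropy weights."""
        h = np.array([shannon_entropy(s) for s in sources])
        w = 1.0 / np.maximum(h, 1e-12)        # guard against zero entropy
        w /= w.sum()
        means = np.array([np.mean(s) for s in sources])
        return float(w @ means), w

    rng = np.random.default_rng(2)
    sources = [rng.normal(100, 5, 40),        # tight, reliable source
               rng.normal(103, 25, 40),       # diffuse source
               rng.normal(97, 15, 40)]        # moderate source
    fused, weights = entropy_fused_mean(sources)
    print(fused, weights)

Replacing the entropy estimator or the weight transform is exactly the kind of design choice that alternative EBDF formulations explore.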
25

Barrois, Benjamin. "Methods to evaluate accuracy-energy trade-off in operator-level approximate computing." Thesis, Rennes 1, 2017. http://www.theses.fr/2017REN1S097/document.

Abstract:
The physical limits being reached in silicon-based computing, new ways have to be found to overcome the predicted end of Moore's law. Many applications can tolerate approximations in their computations at several levels without degrading the quality of their output, or degrading it in an acceptable way. This thesis focuses on approximate arithmetic architectures to seize this opportunity. Firstly, a critical study of state-of-the-art approximate adders and multipliers is presented. Then, a model for fixed-point error propagation leveraging power spectral density is proposed, followed by a model for bitwise-error rate propagation of approximate operators. Approximate operators are then used for the reproduction of voltage over-scaling effects in exact arithmetic operators. Leveraging our open-source framework ApxPerf and its synthesizable template-based C++ libraries apx_fixed for approximate operators, and ct_float for low-power floating-point arithmetic, two consecutive studies are proposed leveraging complex signal processing applications. Firstly, approximate operators are compared to fixed-point arithmetic, and the superiority of fixed-point is highlighted. Secondly, fixed-point is compared to small-width floating-point in equivalent conditions. Depending on the applicative conditions, floating-point shows an unexpected competitiveness compared to fixed-point. The results and discussions of this thesis give a fresh look on approximate arithmetic and suggest new directions for the future of energy-efficient architectures
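A small numerical sketch of the classic model behind fixed-point error analysis (the uniform-quantization noise result, not the thesis's PSD-based propagation model): rounding to n fractional bits behaves like additive noise of variance q^2/12 with step q = 2^-n.

    import numpy as np

    def quantize_fixed(x, frac_bits):
        """Round to a fixed-point grid with `frac_bits` fractional bits."""
        q = 2.0 ** -frac_bits
        return np.round(x / q) * q

    rng = np.random.default_rng(3)
    x = rng.uniform(-1.0, 1.0, 100_000)       # test signal in [-1, 1)
    for n in (8, 12, 16):
        err = quantize_fixed(x, n) - x
        q = 2.0 ** -n
        print(f"n={n:2d}: measured var = {err.var():.3e}, q^2/12 = {q*q/12:.3e}")

The agreement between the measured and predicted variances is what makes white-noise models a useful baseline before moving to the spectral (power spectral density) view used in the thesis.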
26

Johansson, Anders Sture. "Develop Methods To Evaluate the Performance of Aflatoxin Sampling Plans for Shelled Corn." NCSU, 1998. http://www.lib.ncsu.edu/theses/available/etd-19981228-110754.

Full text
Abstract:

Eighteen lots of shelled corn were tested for aflatoxin contamination. The variability and distributional characteristics associated with the aflatoxin testing procedure were investigated. The total variance associated with testing shelled corn was estimated and partitioned into sampling, sample preparation, and analytical variances. All variances were found to increase with an increase in aflatoxin concentration. Using regression analysis, mathematical expressions were developed to model the relationship between aflatoxin concentration and the total, sampling, sample preparation, and analytical variances. The expressions for these relationships were used to estimate the variance for any sample size, subsample size, and number of analyses at a specific aflatoxin concentration. For example, when testing a lot with 20 parts per billion (ppb) aflatoxin using a 2.5 lb sample, Romer mill and 50 g subsample, and HPLC analysis, the total, sampling, sample preparation, and analytical variances are 274.9 (CV=82.9%), 214.0 (CV=73.1%), 56.3 (CV=37.5%), and 4.6 (CV=10.7%), respectively. The sampling, sample preparation, and analytical steps account for 77.8%, 20.5%, and 1.7% of the total variance, respectively. Next, fifteen positively skewed distributions were each fitted to 18 empirical distributions of aflatoxin test results for shelled corn. The compound gamma distribution was selected to model the sample aflatoxin test results for shelled corn, and the method of moments technique was chosen to estimate the parameters of the compound gamma distribution. Mathematical expressions were developed to calculate the parameters of the compound gamma distribution for any lot aflatoxin concentration and test procedure. Observed acceptance probabilities were compared to operating characteristic curves predicted from the compound gamma distribution, and all 18 distributions of sample aflatoxin test results were found to lie within a 95% confidence band. Using the mean and variance relationships to compute the parameters of the compound gamma distribution, 16 sampling plans, based on four sample sizes and four sample acceptance levels, were created and analyzed. For a given sample size, decreasing the sample acceptance level (relative to a sample acceptance level equal to the regulatory guideline): (a) decreases the percentage of lots accepted while increasing the percentage of lots rejected at all aflatoxin concentrations; (b) increases misclassification of lots (both false positives and false negatives) while decreasing the percentage of correct decisions; and (c) decreases the average aflatoxin concentration in the lots accepted and the lots rejected. For a given sample size where the sample acceptance level is less than the regulatory guideline, the number of false positives increases and the number of false negatives decreases when compared to the situation where the sample acceptance level equals the regulatory guideline. For a given sample size where the sample acceptance level is greater than the regulatory guideline, the number of false positives decreases and the number of false negatives increases when compared to the situation where the sample acceptance level equals the regulatory guideline.
Increasing the sample size for a given sample acceptance level, where the legal limit equals the sample acceptance level: (a) increases the percentage of lots accepted at lower concentrations while increasing the percentage of lots rejected at higher concentrations; (b) decreases misclassification of lots (both false positives and false negatives) while increasing the percentage of correct decisions; and (c) decreases the average aflatoxin concentration in the lots accepted while increasing the average aflatoxin concentration in the rejected lots.
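As a quick sanity check, the coefficients of variation quoted in the example above follow from the standard relation CV = 100·√(variance)/concentration. A short Python sketch using only the numbers taken from the abstract:

```python
import math

concentration_ppb = 20.0  # lot concentration from the abstract's example

# Variance components reported for a 2.5 lb sample, Romer mill,
# 50 g subsample, and HPLC analysis.
variances = {"total": 274.9, "sampling": 214.0,
             "sample preparation": 56.3, "analytical": 4.6}

for name, var in variances.items():
    cv = 100.0 * math.sqrt(var) / concentration_ppb
    share = 100.0 * var / variances["total"]
    print(f"{name:>18}: CV = {cv:5.1f}%, share of total = {share:5.1f}%")
```

Running this reproduces the quoted CVs (82.9%, 73.1%, 37.5%, 10.7%) and variance shares (77.8%, 20.5%, 1.7%).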

APA, Harvard, Vancouver, ISO, and other styles
28

Holder, Mark Travis. "Using a complex model of sequence evolution to evaluate and improve phylogenetic methods." Access restricted to users with UT Austin EID Full text (PDF) from UMI/Dissertation Abstracts International, 2001. http://wwwlib.umi.com/cr/utexas/fullcit?p3037500.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Olofsson, Jonas. "Base cations in forest soils : A pilot project to evaluate different extraction methods." Thesis, Institutionen för mark och miljö, Sveriges lantbruksuniversitet, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-308442.

Full text
Abstract:
Acidification has been a known problem in Sweden for several decades. Sulphurous compounds, spread from the British Isles and the European continent, led to a decrease in the pH value of the rain that fell over Sweden. Since the acidification was discovered in the 1960s, active measures against the sulphurous deposition have been undertaken. The sulphurous deposition has decreased by 90%, and the problem was for some time considered under control, until recently, when a new era of acidification may have started. Due to the increased demand for renewable energy, and Sweden's potential to use biomass instead of fossil fuels, whole-tree harvesting has become more widely utilized. Studies indicate that forest soils are depleted of base cations at a faster rate when whole-tree harvesting is performed compared to regular stem harvesting. Mass balance calculations and simulations indicate that an increased biological uptake of base cations due to whole-tree harvesting leads to increased biological acidification. However, although many studies agree that the impact of whole-tree harvesting on the base cation supply of soils is significant, long-running Swedish experiments indicate that the difference between whole-tree harvesting and regular stem harvesting diminishes over time. After a 40-year period, the differences in base cation supply between whole-tree harvested soils and stem-harvested soils are small. The reason for this could be processes that reallocate base cations from pools that are not usually studied. The aim has been to investigate and evaluate the capability of different chemical extraction methods (Aqua Regia, HCl, EDTA, BaCl2, NH4OAc and water) to extract the base cations calcium, potassium, magnesium and sodium from four different Swedish forest soils, and what this means for our understanding of how much base cations a soil contains. The extractions indicated a statistically significant difference between the methods' ability to extract base cations. Generally, Aqua Regia was the most potent method, followed by HCl, EDTA, BaCl2, NH4OAc and water in decreasing order of effectiveness. Linear correlations were found between EDTA, BaCl2 and NH4OAc. The internationally widely used NH4OAc method was considered to be at risk of underestimating the amount of base cations in the soil.
APA, Harvard, Vancouver, ISO, and other styles
30

Karlsson, Niklas. "Method to evaluate technical solutions Handle external loads on subsea wellheads." Thesis, KTH, Hållfasthetslära (Avd.), 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-252771.

Full text
Abstract:
A method to evaluate technical solutions for handling external loads on subsea wellheads has been developed. The solutions, or concepts, are compared with respect to load relief, cost and operation. As a basis, a Pugh matrix was used. It is well proven and commonly used among engineers to evaluate concepts. However, its simplicity brings some major drawbacks, so two more layers were added to solve or minimize them, making up a total of three layers:
I. Evaluation - gather concept data and answer questions with values.
II. Transformation - transform the gathered values to a [1-5] scale.
III. Comparison - present the scaled values in a Pugh matrix.
In layer I, questions are answered through analyses and expert knowledge, carried out by developers. For layer II, a value-scaling relationship should be set by developers and decision makers: they decide what is a good difference compared with a reference, and what is not. The values from layer I can then be translated to layer III. Lastly, in layer III, the performance of each concept with respect to the different criteria is stated in a Pugh matrix, using the [1-5] scale. The decision maker decides which criteria are most important by weighting them. Besides that, everything can be automated, so when the method was carried out on two concepts, a winner could be decided immediately in layer III once the questions in layer I had been answered. A simple and straightforward method to compare concepts has been developed, visualizing the concept evaluation process and the connection between developers and decision makers, making it easier for them to understand one another. The method can be continuously improved over time and might have the potential to make the development process in many companies leaner.
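The three layers can be illustrated with a minimal Python sketch; the criteria, raw values, scaling breakpoints and weights below are invented for illustration and are not taken from the thesis.

```python
# Layer I: raw concept data (hypothetical criteria and values).
reference = {"load_relief_pct": 0.0, "cost_musd": 1.0, "days_offline": 5}
concept_a = {"load_relief_pct": 30.0, "cost_musd": 1.4, "days_offline": 7}

# Layer II: map raw differences from the reference onto a 1-5 scale,
# where 3 means "same as reference" (breakpoints set by the developers).
def to_scale(value, ref, higher_is_better, step):
    diff = (value - ref) if higher_is_better else (ref - value)
    return max(1, min(5, 3 + round(diff / step)))

scaling = {  # (higher_is_better, raw difference per scale unit)
    "load_relief_pct": (True, 15.0),
    "cost_musd": (False, 0.5),
    "days_offline": (False, 2.0),
}

# Layer III: weighted Pugh-style comparison against the reference.
weights = {"load_relief_pct": 3, "cost_musd": 2, "days_offline": 1}
score = sum(weights[c] * to_scale(concept_a[c], reference[c], *scaling[c])
            for c in weights)
ref_score = sum(3 * w for w in weights.values())  # reference scores 3 everywhere
print(f"concept A: {score}, reference: {ref_score}")
```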
APA, Harvard, Vancouver, ISO, and other styles
31

Stinson, Jesse. "A quantitative method to evaluate the effect of xylanases in baking." Thesis, Kansas State University, 2012. http://hdl.handle.net/2097/19752.

Full text
Abstract:
Master of Science
Food Science Institute
Fadi Aramouni
β-(1,4)-endoxylanases, commonly referred to as xylanases, have become integral to the industrial breadmaking process. This enzyme is known to improve dough rheology, loaf volume, and crumb grain. Significant research has been conducted on the structure, function, and inhibition of xylanases, but there is currently no quick and reproducible method to evaluate their effect in baking. The goal of this research was to develop a quantitative method for this purpose and to determine why the effect of xylanases varies with different wheat flours. The currently used methods of test baking, dough stickiness, and spectrophotometric analysis for reducing sugars were evaluated and failed to provide reproducible results. Therefore, a new method was developed to measure the Flour Water Expression Rate (FWER) with the addition of xylanases. Commercially available enzymes from Aspergillus niger and Bacillus subtilis were evaluated in this study. The FWER method measures the amount of water released by the xylanase over a set period of time. This method consistently provided statistically significant data (p<0.05) and allowed a comparison of xylanases from A. niger and B. subtilis in different flours. The results indicated that the xylanase from A. niger tends to release more water, i.e., to have a higher FWER value, than the xylanase from B. subtilis. In one flour, A. niger xylanase resulted in an FWER of 15.18, compared with 9.57 for B. subtilis xylanase at equivalent activities. However, inhibitors in the wheat appeared to affect the FWER, which was evaluated with an uninhibited xylanase from B. subtilis. This new method for evaluating xylanases in baking suggests that varying levels of xylanase inhibitors in wheat may be the reason xylanases affect wheat flours differently.
APA, Harvard, Vancouver, ISO, and other styles
32

Eriksson, Tony. "Creating a method to evaluate frameworks used to build web applications." Thesis, Umeå universitet, Institutionen för datavetenskap, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-172360.

Full text
Abstract:
Single page applications are a new kind of application that runs in almost any web browser and can therefore use the same code base independent of the platform on which it is used. To create a Single Page Application, a development framework is often used. Today there exists a wide variety of such frameworks, most with their own specialties and drawbacks. How well a development framework made to create Single Page Applications fits a project depends on the situation. This work creates a framework to evaluate how well a development framework fits any given project. Blazor is a new framework that can be used to create Single Page Applications. Once the evaluation framework was defined, it was used to evaluate how well Blazor would work if used to develop a specific application. By doing this, the evaluation framework was tested and, at the same time, knowledge about Blazor was obtained. The work concludes by applying the developed framework and showing that Blazor is a good fit for the given scenario. This evaluation also demonstrates the power of the evaluation framework and leads to the conclusion that it is a good tool that can help strengthen decisions on whether or not to use a specific framework.
APA, Harvard, Vancouver, ISO, and other styles
33

Armour, Arthur David 1964. "Adaptive random search evaluated as a method for calibration of the SMA-NWSFS model." Thesis, The University of Arizona, 1990. http://hdl.handle.net/10150/278394.

Full text
Abstract:
Random search methods are becoming more widely used to estimate model parameters. Their ability to search a parameter space globally makes them attractive for solving problems with multiple local optima, such as non-linear hydrologic models like Conceptual Rainfall-Runoff (CRR) models. This thesis investigates the ability of Adaptive Random Search (ARS) to find the global optimum of the CRR model known as the Soil Moisture Accounting Model of the National Weather Service River Forecast System (SMA-NWSRFS) and compares its performance to that of Uniform Random Search (URS). Research results indicate that, although ARS was slightly more efficient than URS, neither strategy demonstrated an ability to converge to the globally optimal parameter set. Factors that inhibit convergence include characteristics of the model structure and an insufficient number of points searched. Ways for random search techniques to identify and address these problems are discussed.
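For illustration, here is a generic sketch of the two strategies on a toy two-parameter error surface; this is not the SMA-NWSRFS calibration problem, and the ARS variant shown (shrinking the search range around the incumbent best point) is a simplified stand-in for the algorithm studied in the thesis.

```python
import random

def objective(x, y):
    # Toy error surface standing in for a CRR model's objective function.
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + 0.3 * (x * y) ** 2

bounds = [(-5.0, 5.0), (-5.0, 5.0)]

def uniform_random_search(n):
    best, best_f = None, float("inf")
    for _ in range(n):
        p = [random.uniform(lo, hi) for lo, hi in bounds]
        f = objective(*p)
        if f < best_f:
            best, best_f = p, f
    return best, best_f

def adaptive_random_search(n, shrink=0.95):
    center = [(lo + hi) / 2 for lo, hi in bounds]
    width = [(hi - lo) / 2 for lo, hi in bounds]
    best_f = objective(*center)
    for _ in range(n):
        p = [min(max(c + random.uniform(-w, w), lo), hi)
             for c, w, (lo, hi) in zip(center, width, bounds)]
        f = objective(*p)
        if f < best_f:
            center, best_f = p, f
        width = [w * shrink for w in width]  # focus around the incumbent best
    return center, best_f

random.seed(1)
print(uniform_random_search(2000))
print(adaptive_random_search(2000))
```

The shrinking width is what makes the search "adaptive": it concentrates samples near the current best point, which also explains how it can miss the global optimum when the shrinkage outpaces exploration.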
APA, Harvard, Vancouver, ISO, and other styles
34

Rickardson, Linda. "New Methods to Screen for Cancer Drugs and to Evaluate their Mechanism of Action." Doctoral thesis, Uppsala University, Clinical Pharmacology, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-8440.

Full text
Abstract:

Cancer is a common disease, and due to problems with resistance to cancer drugs and the limited benefit of chemotherapy in many diagnoses, there is a need to develop new cancer drugs. In this thesis, new methods to screen for cancer drugs and to evaluate their mechanism of action are discussed.

In Paper I, it was found that by studying the gene expression of a cell line panel and combining the data with sensitivity data for a number of cytotoxic drugs, it was possible to cluster compounds according to mechanism of action as well as to identify genes associated with chemosensitivity.

In Paper II, studies of compounds with selective activity in drug-resistant cell lines revealed the glucocorticoids as a group of interesting compounds. The glucocorticoid receptor was overexpressed in 8226/Dox40 and the difference in sensitivity was abolished when the cells were treated with a glucocorticoid receptor antagonist.

In Paper III, an image-based screening method for new proteasome inhibitors was successfully developed and the compounds disulfiram, PDTC and NSC 95397 were identified as inhibitors of the proteasome.

In Paper IV, disulfiram and PDTC were shown to induce cytotoxic activity, to inhibit the activation of the transcription factor NFkappaB and to inhibit the degradation of proteins normally degraded by the proteasome.

In Paper V, NSC 95397 was shown to be cytotoxic to all cells in the resistance-based cell line panel as well as to patient samples from a variety of cancer diagnoses. Connectivity Map was successfully used as a tool to propose a new mechanism of action for NSC 95397. The gene expression induced by NSC 95397 treatment was similar to that induced by several proteasome inhibitors not present in the Connectivity Map.

APA, Harvard, Vancouver, ISO, and other styles
35

Ruparelia, Prina. "Novel methods to evaluate the impact of cigarette smoking and COPD on lung physiology." Thesis, University of Bristol, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.526057.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Beckman, Rehnman Jeannette. "New methods to evaluate the effect of conventional and modified crosslinking treatment for keratoconus." Doctoral thesis, Umeå universitet, Oftalmiatrik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-110531.

Full text
Abstract:
Background: Today, corneal crosslinking with ultraviolet-A photoactivation of riboflavin is an established method to halt the progression of keratoconus. In some cases, when the refractive errors are large and the visual acuity is low, conventional corneal crosslinking may not be sufficient. In these cases, a treatment that both halts the progression and reduces the refractive errors and improves the quality of vision would be desirable. Aims: The aims of this thesis were to determine whether mechanical compression of the cornea during corneal crosslinking for keratoconus, using a sutured rigid contact lens, could improve the optical and visual outcomes of the treatment, and to find methods to evaluate the effect of different corneal crosslinking treatment regimens. Methods: In a prospective, open, randomized case-control study, 60 eyes of 43 patients with progressive keratoconus, aged 18-28 years, planned for routine corneal crosslinking, and a corresponding age- and sex-matched control group were included. The patients were randomized to conventional corneal crosslinking (CXL; n=30) or corneal crosslinking with mechanical compression of the cornea during the treatment (CRXL; n=30). Biomicroscopy, autorefractometry, best spectacle-corrected visual acuity, axial length measurement, Pentacam® HR Scheimpflug photography, pachymetry, intraocular pressure measurements and corneal biomechanical assessments were performed before treatment (baseline) and at 1 month and 6 months after the treatment. One of the articles evaluated and compared the optical and visual outcomes between CXL and CRXL, while the other three focused on methods to evaluate treatment effects. In Paper I, corneal light scattering was manually quantified from Scheimpflug images throughout the corneal thickness at 8 measurement points, 0.0 to 3.0 mm from the corneal centre, in patients treated with CXL. In Paper IV, corneal densitometry (light scattering) was measured with the Pentacam® HR software in 4 circular zones around the corneal apex and at 3 different depths of the corneal stroma, in both CXL- and CRXL-treated corneas. Paper III quantified the biomechanical effects of CXL in vivo. Results: Corneal light scattering after CXL showed distinctive spatial and temporal profiles, and Applanation Resonance Tonometry (ART) technology demonstrated an increased corneal hysteresis 1 and 6 months after CXL. When comparing the refractive and structural results after CXL and CRXL, CRXL failed to flatten the cornea, and the treatment did not show any benefit over conventional CXL treatment; some variables even indicated an inferior effect. Accordingly, the increase in corneal densitometry was also less pronounced after CRXL. Conclusions: Analysis of corneal light scattering/densitometry shows tissue changes at the expected treatment location and may be a relevant variable in evaluating the crosslinking effect. ART technology is an in vivo method with the potential to assess the increased corneal hysteresis after CXL treatment. By refining the method, ART may become a useful tool in the future. Unfortunately, CRXL does not improve the optical and visual outcomes after corneal crosslinking. Possibly, stronger crosslinking would be necessary to stabilize the cornea in a flattened position.
APA, Harvard, Vancouver, ISO, and other styles
37

Hajiro, Takashi. "Analysis of clinical methods used to evaluate dyspnea in patients with chronic obstructive pulmonarydisease." Kyoto University, 2001. http://hdl.handle.net/2433/150187.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Tavel, Léonie von. "Influence of management factors on methods to evaluate the nutritional and metabolic status in dairy herds and comparison of these methods /." [S.l.] : [s.n.], 2003. http://www.zb.unibe.ch/download/eldiss/03tavel_l.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Oudshoorn, Bodil. "Development of a test method to evaluate laceration risk of studded footwear." Thesis, Sheffield Hallam University, 2018. http://shura.shu.ac.uk/22416/.

Full text
Abstract:
Studded footwear has previously caused a number of severe laceration injuries in rugby union. Current test methods for assessing the laceration injury risk of rugby stud designs are unrepresentative of the game and are not mandatory for manufacturers to follow. The aim of this project was to develop a new, game-representative test method to assess the laceration injury risk of stud designs used in rugby union. First, the prevalence of skin and laceration injuries in rugby union was assessed through a systematic literature review of epidemiological studies. It was found that 2.4 skin injuries occurred per 1000 match hours, which could be interpreted as one time-loss injury per team per season. A survey study of 191 rugby players was then conducted, indicating that stamping in the ruck was the most prevalent cause of stud laceration injuries. Following this, twelve participants were asked to perform stamping impacts in a simulated rucking scenario. Three-dimensional shoe kinematics and individual stud kinetics were measured for each impact. Two key phases were identified: an initial impact phase, and a subsequent raking phase. A two-phase mechanical test method was developed based on the results of the stamping study. In the initial impact phase, the stud is attached to a pendulum impacting a skin simulant. The velocity, stud angle and mass of the impact can be adjusted. The stud and skin simulant are then moved to the second phase, performing a controlled rake. In this phase, raking speed, stud angle and stud mass can be changed. Finally, six studs were compared on their predicted laceration injury risk using the developed method. Four of the tested studs were bespoke designs incorporating different edge radii and top diameters. The developed test method showed an increased laceration injury risk when stud edge radius or top diameter was reduced. Two of the tested studs were commercially available designs which had previously passed rugby union's current studded footwear tests. One of the commercial studs showed an increased risk of laceration in the developed test method. Future research should focus on improving the developed test method's validity and investigating the influence of stud material, shape and wear on laceration injury risk.
APA, Harvard, Vancouver, ISO, and other styles
40

Xu, Quan. "A Method to Evaluate the Interfacial Friction Between Carbon Nanotubes and Matrix." University of Akron / OhioLINK, 2011. http://rave.ohiolink.edu/etdc/view?acc_num=akron1302202094.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Mulaka, Brahmananda Reddy. "DEVELOPMENT OF A METHOD TO EVALUATE WRINKLING TENDENCY OF INK-JET PAPERS." Miami University / OhioLINK, 2005. http://rave.ohiolink.edu/etdc/view?acc_num=miami1126193926.

Full text
APA, Harvard, Vancouver, ISO, and other styles
42

吳少輝 and Siu-fai Ng. "The early warning system of debt servicing difficulties of a country, by using statistical method to evaluate economic, social and politicalfactors." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1985. http://hub.hku.hk/bib/B31263343.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Sato, Shinya. "Development and application of methods to evaluate temporal changes in subsurface resistivity structures using magnetotellurics." Doctoral thesis, Kyoto University, 2021. http://hdl.handle.net/2433/263630.

Full text
APA, Harvard, Vancouver, ISO, and other styles
44

Holmgren, Mary. "A method to evaluate environmental enrichments for Asian elephants (Elephas maximus) in zoos." Thesis, Linköping University, The Department of Physics, Chemistry and Biology, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-11902.

Full text
Abstract:

Environmental enrichment (EE) is used to improve the life of captive animals by giving them more opportunities to express species-specific behaviours. Zoo elephants are among the species in great need of EE because their environment is often barren. Before making an EE permanent, however, it is wise to first test whether it works as intended, to save time and money. Maximum price paid is one measure that can be used to assess whether an animal has any interest in a resource at all. Food is often used as a comparator against EEs in these kinds of studies. The aim was to investigate whether the maximum price paid concept could be used to measure the value of EEs for the two female Asian elephants at Kolmården, and to find an operant test suitable for them for the experimental trials. Three series of food trials were done with each elephant, in which they had to lift weights by pulling a rope with their mouth to get access to 5 kg of hay. The elephants paid a maximum price of 372 and 227 kg, respectively. However, the maximum price the elephants paid for access to the hay was not stable across the three series of trials. Hence it is recommended that the comparator trials be repeated close in time to the EEs to be tested. The readiness with which these elephants performed the task makes it worthwhile to further pursue this approach as one of the means to improve the well-being of zoo elephants.

APA, Harvard, Vancouver, ISO, and other styles
45

Tsai, Yueh-hsun, and 蔡岳勳. "Capacitive Interface Circuits Evaluated by Integrating and Charging Methods." Thesis, 2008. http://ndltd.ncl.edu.tw/handle/54043962127563900522.

Full text
Abstract:
Master's thesis
National Chiao Tung University
Department of Electronics Engineering
96
Capacitive sensors are widely used in various measuring equipment. The goal of this thesis is to design and analyze two different types of capacitive interface circuits and to choose one structure to implement in a discrete-time model prototype chip. The architecture of the integrating interface circuit is based on the switched-capacitor integrator. The charging and discharging speed varies with the differential capacitance, which senses the analog parameter. Through the comparator, the periodic charging and discharging behavior results in a variation of the duty cycle, which achieves readout of the capacitance difference. This architecture can achieve high resolution. The architecture of the charging interface circuit is based on a switched-capacitor sample-and-hold circuit. During the sample phase, the differential sensing capacitors sample two different reference voltages, which are redistributed in the hold phase to generate a stable output voltage. This architecture can achieve high speed, low cost, and low power. In this thesis, the design and simulation of both architectures are completed. From the discussion, a suitable architecture is chosen for further implementation. The charging interface circuit is implemented in layout and as a discrete-time model prototype chip. For a 100 pF nominal capacitance, the measuring range is ±100 pF, which corresponds to output voltages from 1.47 V to 3.26 V. The error of this capacitive interface circuit is less than 5%.
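Taking the reported endpoints at face value, and assuming the readout is approximately linear across the range (an assumption; the abstract only quotes the endpoints and a <5% error), the capacitance-to-voltage mapping can be sketched as:

```python
def readout_voltage(delta_c_pf):
    """Map a capacitance difference in pF to the output voltage in V by
    linear interpolation between the reported endpoints
    (-100 pF -> 1.47 V, +100 pF -> 3.26 V); linearity is assumed."""
    v_min, v_max = 1.47, 3.26
    span_pf = 200.0
    return v_min + (delta_c_pf + 100.0) * (v_max - v_min) / span_pf

for dc in (-100, 0, 100):
    print(f"dC = {dc:+4d} pF -> Vout ~ {readout_voltage(dc):.3f} V")
```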
APA, Harvard, Vancouver, ISO, and other styles
46

Hung, Pi-hui, and 洪碧慧. "The blurry medical image evaluated by using objective methods." Thesis, 2010. http://ndltd.ncl.edu.tw/handle/39705500505475201897.

Full text
Abstract:
Master's thesis
Central Taiwan University of Science and Technology
Graduate Institute of Radiological Science
98
The purpose of this study was to seek an optimal method for measuring the quality of MRI images after filtering. The study used 80 brain MRI slices, including T1WI, T2WI, T1 FLAIR and T2 FLAIR series. The original images were processed with the system's filter and analyzed with objective methods. In this work, the authors applied different levels of filtering to the brain MRI images. Six objective methods were used to evaluate image quality, grouped into pixel-based metrics (PBM) and window-based metrics (WBM). The PBM included Mean Square Error (MSE), Signal-to-Noise Ratio (SNR) and Peak Signal-to-Noise Ratio (PSNR), while the Universal Quality Index (UQI), Mean Structural Similarity (MSSIM) and Moran's Peak Ratio (MPR) were the WBM. The PBM were very sensitive to image degradation but did not correlate well with subjective quality measures. The WBM estimated image spatial information from a local region, especially the MPR measurement. Applying the filter process by series, the researchers found that the images showed differences in MPR between T1 and T2. The MPR was shown to be very sensitive to the image quality of filtered MRI images, and MPR is recommended for measuring the quality of MRI images after filtering.
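The pixel-based metrics above have standard textbook definitions. Here is a minimal NumPy sketch of MSE and PSNR for two equally sized grayscale images, with synthetic arrays standing in for the MRI slices (the window-based metrics such as UQI, MSSIM and MPR need local-window statistics and are omitted):

```python
import numpy as np

def mse(ref, test):
    ref = ref.astype(np.float64)
    test = test.astype(np.float64)
    return np.mean((ref - test) ** 2)

def psnr(ref, test, peak=255.0):
    m = mse(ref, test)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

# Hypothetical example: a reference slice and a slightly perturbed copy.
rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(128, 128))
filtered = np.clip(reference + rng.normal(0, 5, size=(128, 128)), 0, 255)
print(f"MSE = {mse(reference, filtered):.2f}, "
      f"PSNR = {psnr(reference, filtered):.2f} dB")
```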
APA, Harvard, Vancouver, ISO, and other styles
47

Huang, Chien-Hsun, and 黃建勳. "Using Backward-type Portfolio Selection Methods to Construct Optimal Portfolio Evaluated Index and Model." Thesis, 2004. http://ndltd.ncl.edu.tw/handle/74797563727817301130.

Full text
Abstract:
Master's thesis
National Tsing Hua University
Department of Industrial Engineering and Engineering Management
92
Portfolio selection methods have been developed in many fields. Many techniques and mathematical models are used to solve related problems, based on the mean-variance model developed for stock markets. Much research focuses on evaluating items and forming a portfolio from good items; such methods are forward-type. In contrast, this study uses a "backward-type" portfolio selection method. From the perspective of backward-type selection, this thesis classifies portfolio attributes into three categories: independent, interrelated and synergistic. Unlike the mean-variance model, which considers risk as the selection criterion, this thesis takes the performance (i.e., future return) emphasized by the investor as the target. Using the partial R-squared statistics from a stepwise regression on performance, the investor's attitude toward (i.e., the relative importance of) each attribute is obtained periodically, and an evaluation index is constructed. Based on the index, the study then constructs a multi-criteria mixed-integer quadratic programming model and a quadratic programming model, under different definitions of the synergistic attributes, to obtain the invested position of each stock in the portfolio. Finally, this study presents illustrations from the Taiwan stock market and finds that the backward-type selection methods, with company profitability and the synergistic attribute included in the model, perform well.
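One way to read the weighting step described above: the partial R-squared values from the stepwise regression are normalized into attribute weights, and a stock's evaluation index is the weighted sum of its attribute scores. A hedged sketch of that reading; the attribute names and all numbers are invented:

```python
# Hypothetical partial R-squared values from a stepwise regression
# of future return on candidate attributes.
partial_r2 = {"profitability": 0.12, "momentum": 0.06, "liquidity": 0.02}

total = sum(partial_r2.values())
weights = {k: v / total for k, v in partial_r2.items()}  # relative importance

# Evaluation index for one stock, given its standardized attribute scores.
stock_scores = {"profitability": 0.8, "momentum": -0.1, "liquidity": 0.4}
index = sum(weights[a] * stock_scores[a] for a in weights)
print(weights, round(index, 4))
```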
APA, Harvard, Vancouver, ISO, and other styles
48

Chung, Hsin Cheng, and 鍾欣丞. "Comparison of the Evaluated Neutron Spectra between the Methods of Traditional and Extended-Range Bonner Spheres." Thesis, 2016. http://ndltd.ncl.edu.tw/handle/07818631392541069021.

Full text
Abstract:
Master's thesis
National Tsing Hua University
Department of Biomedical Engineering and Environmental Sciences
104
High-energy radiation particles are increasingly used in radiation therapy, and they are always accompanied by high-energy neutrons. To evaluate the distributions of neutron energies and fluences, the Bonner Sphere Spectrometer (BSS) is internationally the most commonly used instrument for assessing the neutron spectrum, but it performs poorly when the particle energy exceeds 10 MeV. High-atomic-number materials such as lead or copper are added to the traditional BSS to increase the (n, xn) cross-section and thus enhance the magnitude of the response functions at high energies. This new detection system is called the Extended-Range Bonner Sphere Spectrometer (ERBSS). In this research, a BSS with 7 different sphere sizes and a self-made ERBSS, including 2 lead sphere shells and 2 polyethylene sphere shells with gold foils, were used to measure the neutron spectra of a standard Cf-252 neutron source and the Tsing-Hua Open-pool Reactor (THOR). The neutron spectra of Cf-252 and THOR measured with the traditional BSS and the ERBSS were compared. The results showed that the spectra from the ERBSS and the traditional BSS are similar for both the standard Cf-252 source and THOR, verifying the ability of the self-made ERBSS to measure neutron spectra. The results also showed that the traditional BSS combined with the PbPE sphere shells fits the real neutron spectrum better, and that enlarging the sphere set can increase the calculation accuracy of the UMG 3.3 program, which calculates the neutron spectrum. High-energy neutron facilities are going to be constructed and operated in Taiwan in the future. The experimental verification in this study confirms the feasibility of using the self-made ERBSS system to evaluate and analyze high-energy neutron spectra.
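The unfolding behind both BSS and ERBSS measurements rests on the folding relation C_i = Σ_j R_ij φ_j, where C_i are the sphere readings, R is the response matrix and φ holds the group fluences. A toy, noise-free least-squares illustration of that relation; the response values below are invented, and the thesis uses the UMG 3.3 unfolding code rather than this naive inversion:

```python
import numpy as np

# Toy response matrix: 4 spheres x 3 energy groups (invented values).
R = np.array([[0.9, 0.3, 0.05],
              [0.5, 0.8, 0.20],
              [0.2, 0.6, 0.70],
              [0.1, 0.3, 0.90]])
true_flux = np.array([100.0, 50.0, 20.0])  # group fluences
counts = R @ true_flux                     # folded sphere readings

# Naive least-squares stand-in for a real unfolding program.
flux_est, *_ = np.linalg.lstsq(R, counts, rcond=None)
print(np.round(flux_est, 3))  # recovers the fluences in this noise-free toy
```

Real unfolding is ill-posed once measurement noise enters, which is why dedicated codes with regularization or prior spectra, such as UMG 3.3, are used in practice.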
APA, Harvard, Vancouver, ISO, and other styles
49

Shih, Lanyi, and 施嵐依. "Using Quantitative Analysis Methods to Evaluate Supplier Performance." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/27743727342384156554.

Full text
Abstract:
Master's thesis
National Taipei University
Department of Business Administration
100
A supply chain involves a number of upstream and downstream units. How such units cooperate with each other to acquire maximum benefit is a vital issue for the operations performance of not only the whole supply chain system but also every individual unit. Through a case study of a technology company whose main activities are the research, development and manufacture of a variety of modular computer system products, the study explores the topic in two stages. In the first stage, the attributes and weights provided by the technology company are analyzed quantitatively; the results are used to rank the suppliers and compare their differences. Based on the "projection" and "difference" values derived from DEA-Solver software calculations, the study gives suggestions about the direction and magnitude of performance improvement. The second stage employs the opinions of the company's directors and the Kano model to select 7 supplier attributes and their weights, after a review of the supplier performance evaluation literature, conducts quantitative analysis, and gives recommendations on the direction and magnitude of performance improvement. The interim results show that the strongest consistency appears between the outcomes of the simple additive weighting method and those of the hierarchical additive weighting method. Finally, combining the Kano model with Importance-Performance Analysis, the study examines the advantages and disadvantages of every attribute for all suppliers, and then, following the Importance-Performance Analysis matrix, points out directions for suppliers to formulate supply chain strategies. The study aims to raise the operations performance of the linked upstream-downstream supply chain.
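For the simple additive weighting step mentioned above, each supplier's score is the weighted sum of its normalized attribute values. A small sketch; the suppliers, attributes and weights are invented:

```python
# Hypothetical attribute scores (already normalized to 0-1, higher is better).
suppliers = {
    "S1": {"quality": 0.9, "delivery": 0.6, "cost": 0.6},
    "S2": {"quality": 0.7, "delivery": 0.8, "cost": 0.9},
}
weights = {"quality": 0.5, "delivery": 0.3, "cost": 0.2}  # sum to 1

saw = {name: sum(weights[a] * attrs[a] for a in weights)
       for name, attrs in suppliers.items()}
for name, score in sorted(saw.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {score:.3f}")
```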
APA, Harvard, Vancouver, ISO, and other styles
50

"A simulation approach to evaluate combining forecasts methods." Chinese University of Hong Kong, 1994. http://library.cuhk.edu.hk/record=b5888016.

Full text
Abstract:
by Ho Kwong-shing Lawrence.
Thesis (M.B.A.)--Chinese University of Hong Kong, 1994.
Includes bibliographical references (leaves 43-44).
APA, Harvard, Vancouver, ISO, and other styles
