
Dissertations / Theses on the topic 'Techniques of review'


Consult the top 50 dissertations / theses for your research on the topic 'Techniques of review.'


1

Lundgren, Mikael, and Ermin Hrkalovic. "Review of Displacement Mapping Techniques and Optimization." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4424.

Abstract:
This paper explores different bump mapping techniques and their implementation. Bump mapping is a technique used in computer games to make simple 3D objects look more detailed than they really are. It uses a texture to perturb an object's normals to simulate bumps, avoiding the need to render high-polygon geometry. Over the years several techniques have been developed based on bump mapping, among them normal mapping, relief mapping, parallax occlusion mapping and quadtree displacement mapping. In the first part of the paper we go through our goals and research methodology. We then describe four different techniques, how they work and how they are implemented. After that we run experiments measuring the techniques against each other. Once the first round of testing is done, we optimize the techniques and run a second test to see how much faster, if at all, the optimized versions are compared to the previous tests. We then present and analyse the test data. Finally we discuss the techniques and the testing, and finish with a conclusion.
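
To make the bump mapping idea concrete, here is a minimal sketch, not taken from the thesis, of how perturbed normals can be derived from a height texture; the NumPy routine and the Gaussian bump are illustrative assumptions.

```python
import numpy as np

def normals_from_height_map(height, strength=1.0):
    """Derive per-texel surface normals from a height map.

    Tilting the base normal (0, 0, 1) by the height-field gradients is how
    bump/normal mapping fakes geometric detail without extra polygons.
    """
    dh_dx = np.gradient(height, axis=1)  # slope along u
    dh_dy = np.gradient(height, axis=0)  # slope along v
    # A surface z = h(x, y) has (unnormalized) normal (-dh/dx, -dh/dy, 1).
    n = np.stack([-strength * dh_dx, -strength * dh_dy,
                  np.ones_like(height)], axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

# Example: a single Gaussian bump on a flat 64x64 surface.
y, x = np.mgrid[-1:1:64j, -1:1:64j]
bump = np.exp(-(x**2 + y**2) * 8.0)
normals = normals_from_height_map(bump)
print(normals.shape)  # (64, 64, 3); flat regions stay close to (0, 0, 1)
```
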
2

Mercier, Laurence. "Review of ultrasound probe calibration techniques for 3D ultrasound." Thesis, McGill University, 2004. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=81555.

Abstract:
Three-dimensional (3D) ultrasound is an emerging technology with numerous clinical applications, such as measuring the volume of the prostate, monitoring fetal development, or evaluating brain shift during neurosurgery. Ultrasound probe calibration is an obligatory step in building 3D volumes from 2D images acquired in a freehand ultrasound system. The role of calibration is to find the transformation that relates the image plane to a sensor attached to the probe. This thesis is a comprehensive review of what has been published in the field of ultrasound probe calibration for 3D ultrasound. It covers tracking technologies, ultrasound image acquisition, phantom design, speed of sound issues, feature extraction, least-squares minimization, temporal calibration, calibration evaluation techniques and phantom comparisons. The calibration phantoms and methods have also been classified in tables to give a better overview of the existing methods.
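
A rough illustration of the transform chain described above; this sketch, with assumed notation and not taken from the thesis, chains the image-to-sensor calibration matrix with the sensor-to-world tracking matrix to map a 2D pixel into 3D space.

```python
import numpy as np

def to_world(pixel, T_image_to_sensor, T_sensor_to_world, mm_per_px):
    """Map a 2D ultrasound pixel into 3D world space.

    Freehand 3D ultrasound chains two transforms: the calibration transform
    (image plane -> tracked sensor on the probe), which probe calibration
    estimates, and the tracking transform (sensor -> world).
    """
    u, v = pixel
    p = np.array([u * mm_per_px, v * mm_per_px, 0.0, 1.0])  # image plane is z = 0
    return (T_sensor_to_world @ T_image_to_sensor @ p)[:3]

# Example: identity tracking, calibration offsets the image plane by 10 mm.
T_cal = np.eye(4)
T_cal[2, 3] = 10.0
print(to_world((100, 50), T_cal, np.eye(4), mm_per_px=0.2))  # [20. 10. 10.]
```
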
3

Baek, Youn Hyeong. "An experimental review of some aircraft parameter identification techniques." Thesis, Cranfield University, 1998. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.285023.

4

Smith, G. G. "A critical review of measurement techniques in coastal hydraulics." Master's thesis, University of Cape Town, 1990. http://hdl.handle.net/11427/8292.

Abstract:
Includes bibliographies.
This thesis reviews measurement techniques in the field of coastal hydraulics. Techniques of wave measurement are studied in detail, and the analysis of wave measurements is then dealt with. Particular attention is paid to the analysis of one-dimensional wave energy spectra. Two computer programs were adapted for use on a microcomputer for the analysis of these spectra by the autocorrelation and Fast Fourier Transform techniques. Experiments were conducted to determine the effect of input parameters on the analysed spectrum; the sensitivity of the one-dimensional wave energy spectrum to the number of data points, the sampling interval and the maximum lag number (for autocorrelation analysis) is illustrated. Guidelines and examples are included for the selection of appropriate parameters.
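
A minimal sketch of the two spectral estimation routes compared in the thesis, the direct FFT periodogram and the autocorrelation (Wiener-Khinchin) route, applied to synthetic wave data; the normalisation details and parameter values are illustrative assumptions.

```python
import numpy as np

def spectrum_fft(eta, dt):
    """One-sided wave energy spectrum by direct FFT (periodogram)."""
    n = len(eta)
    amp = np.fft.rfft(eta - eta.mean())
    # Density scaled so its integral approximates the signal variance.
    return np.fft.rfftfreq(n, dt), (np.abs(amp) ** 2) * 2.0 * dt / n

def spectrum_autocorr(eta, dt, max_lag):
    """Spectrum via the autocorrelation route (Wiener-Khinchin theorem).

    max_lag is the 'maximum lag number' whose influence the thesis studies:
    fewer lags give a smoother but coarser spectrum.
    """
    eta = eta - eta.mean()
    acf = np.correlate(eta, eta, mode="full")[len(eta) - 1:][:max_lag] / len(eta)
    return np.fft.rfftfreq(len(acf), dt), 2.0 * dt * np.abs(np.fft.rfft(acf))

# Synthetic 0.5 Hz swell sampled at 4 Hz with measurement noise.
dt = 0.25
t = np.arange(0, 512) * dt
eta = np.sin(2 * np.pi * 0.5 * t) + 0.3 * np.random.default_rng(0).standard_normal(t.size)
for freqs, s in (spectrum_fft(eta, dt), spectrum_autocorr(eta, dt, max_lag=64)):
    print(freqs[np.argmax(s)])  # both estimates should peak near 0.5 Hz
```
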
5

Venter, Louis Johannes. "A review of Southern African kimberlites and exploration techniques." Thesis, Rhodes University, 1999. http://hdl.handle.net/10962/d1007278.

Abstract:
The dissertation reviews the present knowledge regarding diamonds, from their formation in the lithospheric upper mantle at depths between 150 and 300 km, to their final valuation in terms of US$/carat by diamantaires in London, Antwerp, Tel Aviv and New York. The dissertation is divided into two complementary sections. Section one focuses on the formation, emplacement, occurrence and characteristics of kimberlites and, when present, their associated trace amounts of diamonds. The section follows a logical sequence from the regional tectonic, local structural and geodynamic controls on kimberlite formation and emplacement to the characteristics of individual kimberlite morphology, mineralogy, petrography and geochemistry. Finally, the environment of diamond formation, resorption and the characteristics that have led to the marketability of diamonds are discussed. Section two reviews the current exploration techniques used in locating diamondiferous kimberlites and the subsequent economic evaluation of these kimberlites. A brief history of known Southern African kimberlite occurrences, grades, tonnages, tectonic settings, ages and regional structural controls is given. The prospective countries mentioned are Angola, Botswana, Lesotho, South Africa, Swaziland, Tanzania and Zimbabwe. Exploration techniques considered are: the application of landscape analysis and investigation of the surface processes active in a given area, indicator mineral sampling (with reference to their mineralogy and exploration significance), remote sensing techniques (subdivided into satellite imagery and aerial photography), geophysical techniques (including the magnetic, gravity, electrical, radiometric and seismic methods as well as heat flow models), geochemical techniques, petrographic and electron beam techniques, as well as geobotanical and geobiological techniques. Finally, a brief summary of current evaluation techniques employed on diamondiferous kimberlite deposits is presented. The review covers kimberlite sampling methods, sample processing and diamond grade distributions (with reference to the experimental variogram model, statistical methods used in grade distribution calculations, and block definition and local grade estimation). Stone size distributions, including microdiamond counts and value estimation, are also discussed.
6

Mahmood, Shahid. "A Systematic Review of Automated Test Data Generation Techniques." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4349.

Abstract:
Automated Test Data Generation (ATDG) is an activity that, in the course of software testing, automatically generates test data for the software under test (SUT). It usually makes testing more efficient and cost effective. Test data generation is crucial for software testing because test data is one of the key factors determining the quality of any software test during its execution. The multi-phased activity of ATDG involves various techniques for each of its phases. This research field is not new by any means, albeit lately new techniques have been devised and a gradual increase in the level of maturity has brought some diversified trends into it. To this end several ATDG techniques are available, but emerging trends in computing have raised the necessity to summarize and assess the current status of the area, particularly for practitioners, future researchers and students. Analysis of ATDG techniques becomes even more important when Miller et al. [4] highlight the hardship in general acceptance of these techniques. Under this scenario only a systematic review can address the issues, because systematic reviews provide evaluation and interpretation of all available research relevant to a particular research question, topic area or phenomenon of interest. This thesis, using a trustworthy, rigorous and auditable methodology, provides a systematic review aimed at presenting a fair evaluation of research concerning ATDG techniques for the period 1997-2006. It also aims at identifying probable gaps in research about ATDG techniques of the defined period, so as to suggest the scope for further research. The systematic review is presented on the pattern of [5, 8] and follows the techniques suggested by [1]. Articles published in journals and conference proceedings during the defined period are of concern in this review; the motive behind this selection is that techniques discussed in the literature of this period might reflect their suitability for the prevailing software environment of today and are believed to fulfill the needs of the foreseeable future. Furthermore, only automated and/or semi-automated ATDG techniques have been chosen for consideration, manual techniques being out of scope. As a result of the preliminary study the review identifies ATDG techniques and relevant articles of the defined period, whereas the detailed study evaluates and interprets all available research relevant to ATDG techniques. For interpretation and elaboration of the discovered ATDG techniques a novel approach called 'Natural Clustering' is introduced. To accomplish the systematic review a comprehensive research method was developed, and its application yielded important results, presented in statistical/numeric, diagrammatic and descriptive forms. Additionally, the thesis introduces various criteria for classification of the discovered ATDG techniques and presents a comprehensive analysis of their results. Some interesting facts are also highlighted in the course of discussion. Finally, the discussion culminates with inferences and recommendations emanating from this analysis. As the research work produced in the thesis is based on a rich amount of trustworthy information, it could also serve as an up-to-date guide to ATDG techniques.
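
For readers new to the area, the sketch below shows one classic ATDG family, random test data generation, in miniature; the function under test, its seeded bug and the oracle are invented for illustration and do not come from the thesis.

```python
import random

def generate_random_tests(func, arg_ranges, oracle, n_cases=1000, seed=1):
    """Minimal random test data generation: sample inputs, check an oracle.

    Random generation is one of the classic ATDG families; path-oriented,
    goal-oriented and search-based techniques guide the sampling instead
    of drawing blindly.
    """
    rng = random.Random(seed)
    failures = []
    for _ in range(n_cases):
        args = [rng.randint(lo, hi) for lo, hi in arg_ranges]
        if not oracle(args, func(*args)):
            failures.append(args)
    return failures

def buggy_mid(a, b, c):
    """System under test: returns the median, with an injected defect."""
    mid = sorted([a, b, c])[1]
    return mid if mid != 0 else max(a, b, c)  # seeded bug when the median is 0

failing = generate_random_tests(buggy_mid, [(-5, 5)] * 3,
                                oracle=lambda args, out: out == sorted(args)[1])
print(failing[:3])  # a few generated inputs that expose the defect
```
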
7

Ni, Weizeng. "A Review and Comparative Study on Univariate Feature Selection Techniques." University of Cincinnati / OhioLINK, 2012. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1353156184.

8

Sundström, Stina. "Coding in Multiple Regression Analysis: A Review of Popular Coding Techniques." Thesis, Uppsala University, Department of Mathematics, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-126614.

9

Shabbir, Kashif, and Muhammad Amar. "Systematic Review on Testing Aspect-oriented Programs : Challenges, Techniques and Their Effectiveness." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3109.

Abstract:
Aspect-oriented programming is a relatively new programming paradigm that builds on the object-oriented paradigm. It deals with those concerns that cross-cut the modularity of traditional programming mechanisms, and it aims at reducing code and providing higher cohesion. As with any new technology, aspect-oriented programming provides some benefits and also carries some costs. In this thesis we have conducted a systematic review on aspect-oriented software testing in the context of testing challenges. A detailed analysis has been made to show how effective the structural test techniques are at handling these challenges. We give an analysis of the effectiveness of aspect-oriented test techniques, based on the research literature.
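
Aspect-oriented examples are usually written in AspectJ; as a language-neutral sketch, the Python decorator below plays the role of advice woven around a join point, which hints at why woven behaviour is hard to target with conventional unit tests. All names are illustrative.

```python
import functools

def logging_aspect(func):
    """'Advice' woven around a join point: a cross-cutting logging concern.

    In AspectJ the weaver inserts such behaviour at every matched join
    point; the testing challenge is that the woven code has no single home
    in the source for a unit test to point at.
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        print(f"entering {func.__name__} args={args}")
        result = func(*args, **kwargs)
        print(f"leaving {func.__name__} -> {result}")
        return result
    return wrapper

@logging_aspect  # 'weaving' the aspect into the base code
def transfer(amount, balance):
    return balance - amount

transfer(30, 100)
```
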
10

Khan, Kashif. "A Systematic Review of Software Requirements Prioritization." Thesis, Blekinge Tekniska Högskola, Avdelningen för programvarusystem, 2006. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4779.

Abstract:
Software engineering research has been, and still is, criticised as immature and unscientific due to a lack of evaluation. However, the software engineering community is now focusing more on empirical research, and there is a movement to adopt approaches from other, mature fields like medical science; one such approach is the systematic review. One of the major activities within the requirements engineering process is requirements prioritization, which helps to focus on the most important requirements. Many prioritization techniques are available for prioritizing software requirements, yet there is a lack of evidence on which technique to prefer. The reasons could be differences in contexts, measurement of variables and usage of data sets. In this thesis, the area of requirements prioritization has been systematically reviewed in order to assess what evidence regarding different prioritisation techniques exists. The results from different studies are contradictory in nature due to variations in study designs, research methodologies and choice of dependent and context variables. Based on the results of the systematic review, a research framework is proposed to provide researchers with a common background for further research within the requirements prioritization area. The goal of the framework is to develop a reliable knowledge base as well as to help researchers conduct and report prioritization studies.
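
As a concrete instance of the kind of technique such reviews compare, here is a minimal sketch of cumulative voting, often called the $100 test; the stakeholders and point allocations are invented for illustration.

```python
def cumulative_voting(allocations):
    """Aggregate $100-test allocations from several stakeholders.

    Each stakeholder distributes exactly 100 points across the requirements;
    summed points give a ratio-scale priority order.
    """
    totals = {}
    for person, votes in allocations.items():
        assert sum(votes.values()) == 100, f"{person} must allocate exactly 100 points"
        for req, points in votes.items():
            totals[req] = totals.get(req, 0) + points
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

votes = {
    "alice": {"R1": 60, "R2": 30, "R3": 10},
    "bob":   {"R1": 20, "R2": 50, "R3": 30},
}
print(cumulative_voting(votes))  # [('R1', 80), ('R2', 80), ('R3', 40)]
```
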
11

Rashdan, Adam. "Requirement prioritization in Software Engineering : A Systematic Literature Review on Techniques and Methods." Thesis, Linnéuniversitetet, Institutionen för datavetenskap och medieteknik (DM), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-105747.

Abstract:
The present study provides a systematic overview of the most important software requirement prioritization techniques. Software requirement prioritization is a process in software engineering that precedes the actual development of software application programs and deals with assigning priorities to individual requirements to define the order of their implementation. The study aims to help researchers and practitioners decide on the right technique, since each has its advantages and limitations. Compared to existing reviews, the current one captures not only the most promising techniques but also the more general trends behind them. The study utilizes the review protocol method and aims to answer four research questions about the most popular techniques, their taxonomy, their limitations, and the processes involved. The empirical data was collected from six databases of scientific manuscripts and put under scrutiny to identify the most relevant and elaborated papers. The results from 53 selected manuscripts and 106 discovered techniques demonstrate that there is evidence of a gradual shift from purely manual elicitation methods to computer-assisted and/or algorithmic ones.
12

Tolley, Rebecca. "Review of Using Authentic Assessment in Information Literacy Programs: Tools, Techniques, Strategies." Digital Commons @ East Tennessee State University, 2019. https://dc.etsu.edu/etsu-works/5617.

13

Whitner, Richard B. "A taxonomical review of software verification techniques: an illustration using discrete-event simulation." Thesis, Virginia Tech, 1988. http://hdl.handle.net/10919/45931.

Abstract:

The use of simulation and modeling as a technique for solving today's complex problems is ever-increasing. Correspondingly, the demands placed on the software which serves as a computer-executable representation of the simulation model are increasing. With the increased complexity of simulation models comes a greater need for model verification, particularly programmed model verification. Unfortunately, current model verification technology is lacking in techniques which satisfy the verification needs. Specifically, there are few guidelines for performing programmed model verification. There is, however, an abundance of software verification techniques which are applicable for simulation model verification. An extensive review of techniques applicable for simulation programmed model verification is presented using the simulation and modeling terminology. A taxonomy for programmed model verification methods is developed. The usefulness of the taxonomy is twofold: (1) the taxonomy provides an approach for applying software verification techniques to the problem of programmed model verification, and (2) the breadth of the taxonomy provides a broad spectrum of perspectives from which to assess the credibility of simulation results. A simulation case study demonstrates the usefulness of the taxonomy and some of the verification techniques.

By referencing this work, one can determine which techniques can be used, and when, throughout the development life cycle, how to perform each technique, how difficult each will be to apply, and how effective each will be. The simulation modeler, as well as the software engineer, will find the taxonomy and techniques valuable tools for guiding verification efforts.


14

Yan, Guoning, and Wenkai Zhan. "Testing of mobile applications. A review of industry practices." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17880.

Abstract:
Context. With the rapid development of mobile devices, mobile applications have occupied an increasingly large market. Software testing is an important factor in ensuring software quality [59]. In order to develop high-quality mobile applications, software companies are paying increasing attention to mobile application testing. We found that there are relatively few studies on mobile application testing methods and tools as used in industry, and research in this direction is very helpful to software companies' mobile application testing, so we chose this direction for exploratory research. Objectives. The purpose of our research is to investigate mobile application testing in software companies. We mainly study the methods and tools of mobile application testing that are now popular in enterprises, the challenges software companies face in mobile application testing, and how they solve those challenges. Method. We used two methods to answer our research questions: a literature review and a survey. The literature review gives us insight into the field of mobile testing, and the survey is used to answer our research questions. Our data collection method is a questionnaire. When we completed the data collection, we performed statistics and analysis on the data and answered our research questions. Result. Through the literature review, we summarize the methods and tools for mobile application testing reported in the literature. By sending 46 questionnaires to different software companies, we analyzed the methods and tools for mobile application testing that are popular in industry and the reasons for choosing them. We also list the challenges companies encountered in mobile application testing and their solutions. Conclusions. We answered all the research questions. We analyzed the feedback from 46 questionnaires and obtained information about mobile application testing, and we summarized the relevant conclusions from the literature. Our findings can help people in related fields by providing the data from the 46 responses and our analysis results, as well as the mobile application testing methods and tools that are popular in industry.
15

Soriano, Maria Lynn. "Student-Consultant Continuum: Incorporating Writing Center Techniques of Peer Review Into the Composition Classroom." John Carroll University / OhioLINK, 2010. http://rave.ohiolink.edu/etdc/view?acc_num=jcu1288706104.

16

Dadpouri, Mohammad, and Kiran Nunna. "A Literature Review on Risk Analysis of Production Location Decisions." Thesis, Tekniska Högskolan, Högskolan i Jönköping, JTH, Industriell organisation och produktion, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hj:diva-17906.

Abstract:
This report is the result of a master thesis with a focus on risk analysis of production location decisions. The project is part of "PROLOC - manufacturing footprint during the product's life cycle". The main aim of this thesis is to point out how current risk analysis techniques evaluate the risks involved in production location decisions, to underline the most important of those risks, and to elicit the strengths and weaknesses of these methods. A systematic review of literature, focused on journal papers in the risk analysis and production fields, is conducted using content analysis and coding techniques. The current risk analysis techniques identified are failure mode and effects analysis (FMEA), life cycle cost (LCC) analysis, and system-based techniques like multiobjective analysis, decision tree analysis, and the analytic hierarchy process (AHP). In addition, two identified frameworks, foreign direct investment (FDI) and international production, are the research fields that have contributed most to identifying the various risks of production location decisions. Having reviewed the literature, it is realized that the majority of companies take a short-sighted view in choosing a production location: they consider only cost-based issues like cheaper raw material and low labour cost in some countries, and simply ignore uncertainties that can be sources of political, economic, social, competitive, and seismic risk. Low-cost countries are often situated in politically unstable areas, which can cause long production halts or expropriation. Political risk is mainly identified in the FDI literature and is usually triggered by political turmoil, a coup d'état, or revolution. On the other hand, cheap labour does not necessarily decrease costs; it may bring about quality issues and damage the company's prestige among customers, resulting in time and monetary loss. Currency exchange and cost inflation often cause the initial forecast and cost analysis to go wrong. Supply risks arise from the disruption of ties with raw material or part suppliers in the home country and might result in the risk of misuse by new suppliers or partners. Seismic risk is introduced as a separate category of risk in production location decisions, a matter that requires further investigation. The study also presents a review of the strengths and weaknesses of existing risk analysis techniques for production location decisions; lack of consistency, vagueness of information, and unfamiliarity with the design-to-cost concept are among the major weaknesses. The study concludes that considering only cost-oriented factors like cheap labour and raw material exposes production companies to various risks and might make the whole investment vain. Suggestions for further study on techniques and risks of production location decisions are also proposed.
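
One of the techniques named above, FMEA, ranks failure modes by a Risk Priority Number (severity x occurrence x detection); the sketch below applies it to invented production-location risks purely for illustration.

```python
def risk_priority_number(severity, occurrence, detection):
    """FMEA-style Risk Priority Number on 1-10 scales.

    Higher severity or occurrence, and poorer detectability (a high
    detection score), push a failure mode up the priority list.
    """
    return severity * occurrence * detection

# Illustrative production-location risks; the scores are made up.
risks = {
    "political turmoil halts the plant": (9, 3, 7),
    "currency swing erodes the cost forecast": (6, 6, 4),
    "supplier quality issues damage prestige": (7, 5, 5),
}
ranked = sorted(((name, risk_priority_number(*scores))
                 for name, scores in risks.items()),
                key=lambda kv: kv[1], reverse=True)
for name, rpn in ranked:
    print(f"{rpn:4d}  {name}")
```
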
17

Botha, Barend HJ. "Systematic review: Availability, effectiveness and safety of assisted reproductive techniques in Sub-Saharan Africa." Master's thesis, University of Cape Town, 2018. http://hdl.handle.net/11427/29315.

Abstract:
STUDY QUESTION: What is the evidence pertaining to availability, effectiveness and safety of assisted reproductive technology (ART) in sub-Saharan Africa?
SUMMARY ANSWER: According to overall limited and heterogeneous evidence, availability and utilization of ART are very low; clinical pregnancy rates largely compare to other regions but are accompanied by high multiple pregnancy rates; and in the near absence of data on deliveries and live births the true degree of effectiveness and safety remains to be established.
WHAT IS KNOWN ALREADY: In most world regions, availability, utilization and outcomes of ART are monitored and reported by national and regional ART registries. In sub-Saharan Africa there is only one national and no regional registry to date, raising the question of what other evidence exists documenting the status of ART in this region.
STUDY DESIGN, SIZE, DURATION: A systematic review was conducted searching the PUBMED, SCOPUS, AFRICAWIDE, WEB OF SCIENCE and CINAHL databases from January 2000 to June 2017. A total of 29 studies were included in the review. The extracted data were not suitable for meta-analysis.
PARTICIPANTS/MATERIALS, SETTING, METHOD: The review was conducted according to PRISMA guidelines. All peer-reviewed manuscripts, irrespective of language or study design, that presented original data pertaining to availability, effectiveness and safety of ART in sub-Saharan Africa were eligible for inclusion. Selection criteria were specified prior to the search. Two authors independently reviewed studies for possible inclusion and critically appraised the selected manuscripts. Data were analyzed descriptively, being unsuitable for statistical analysis.
MAIN RESULTS AND THE ROLE OF CHANCE: The search yielded 810 references, of which 29 were included based on the predefined selection and eligibility criteria. Extracted data came from 23 single-centre observational studies, 2 global ART reports, 2 reviews, 1 national data registry and 1 community-based study. ART services were available in 10 countries and delivered by 80 centres in 6 of these. Data pertaining to the number of procedures existed for 3 countries, totalling 4619 fresh non-donor aspirations in 2010. The most prominent barrier to access was cost. Clinical pregnancy rates ranged from 21.2% to 43.9% per embryo transfer, but information on deliveries and live births was lacking, seriously limiting evaluation of ART effectiveness. When documented, the rate of multiple pregnancy was high, with information on outcomes similarly lacking.
LIMITATIONS, REASONS FOR CAUTION: The findings in this review are based on limited data from a limited number of countries, and are derived from heterogeneous studies, both in terms of study design and quality, many of which have small sample sizes. Although representing the best available evidence, this requires careful interpretation regarding how representative it is of the current status of ART in sub-Saharan Africa.
WIDER IMPLICATIONS OF THE FINDINGS: The true extent and outcome of ART in sub-Saharan Africa could not be reliably documented, as the relevant information was not available. Efforts are currently underway to establish a regional ART data registry in order to report and monitor availability, effectiveness and safety of ART, thus contributing to evidence-based practice and possible development strategies.
STUDY FUNDING/COMPETING INTEREST(S): No funding was received for this study. The authors have no competing interests.
TRIAL REGISTRATION NUMBER: PROSPERO CRD42016032336
18

Podray, Susan. "Current Technology and Techniques in Re-mineralization of White Spot Lesions: A Systematic Review." Master's thesis, Temple University Libraries, 2012. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/170366.

Abstract:
White spot lesions are a common iatrogenic occurrence in patients treated with fixed orthodontic appliances. A dynamic chemical interaction between enamel and saliva at the tooth surface allows a lesion to undergo phase changes involving demineralization and remineralization of enamel. This is due to calcium and phosphate dissolved in saliva being deposited onto or removed from the tooth surface depending on the surrounding pH. Casein phosphopeptide-amorphous calcium phosphate (CPP-ACP) is gaining popularity in dentistry as a way to increase the available levels of calcium and phosphate in plaque and saliva so that the chemical gradient favors remineralization. The aim of our investigation is to search the current literature and formulate a recommendation for the use of CPP-ACP in orthodontics. Publications from the following electronic databases were searched: PubMed, Web of Science, Cochrane Library and Science Direct. Searches from August 2010 to April 1st, 2012 were performed using the terms "MI Paste OR Recaldent OR casein phosphopeptide-amorphous calcium phosphate OR CPP-ACP OR tooth mousse". The searches yielded 155 articles; these were reviewed for relevance based on inclusion and exclusion criteria. Articles with inappropriate study design or no outcome measures at both baseline and end point were excluded. Thirteen articles with high-quality study designs were deemed relevant and included in this study for evaluation. The current literature suggests a preventative treatment regimen in which MI Paste Plus is delivered once daily before bed, after oral hygiene, for 3 minutes in a fluoride tray, throughout orthodontic treatment. It should be recommended for high-risk patients, as determined by poor oral hygiene seen in the inability to remove plaque from teeth and appliances. This protocol may prevent or assist in the remineralization of enamel white spot lesions during and after orthodontic treatment.
19

Eckert, Nathan James. "Review of Railgun Modeling Techniques: The Computation of Railgun Force and Other Key Factors." Thesis, University of Colorado at Boulder, 2017. http://pqdtopen.proquest.com/#viewpdf?dispub=10619188.

Abstract:

Currently, railgun force modeling either uses the simple “railgun force equation” or finite element methods. It is proposed here that a middle ground exists that does not require the solution of partial differential equations, is more readily implemented than finite element methods, and is more accurate than the traditional force equation. To develop this method, it is necessary to examine the core railgun factors: power supply mechanisms, the distribution of current in the rails and in the projectile which slides between them (called the armature), the magnetic field created by the current flowing through these rails, the inductance gradient (a key factor in simplifying railgun analysis, referred to as L'), the resultant Lorentz force, and the heating which accompanies this action. Common power supply technologies are investigated, and the shape of their current pulses are modeled. The main causes of current concentration are described, and a rudimentary method for computing current distribution in solid rails and a rectangular armature is shown to have promising accuracy with respect to outside finite element results. The magnetic field is modeled with two methods using the Biot-Savart law, and generally good agreement is obtained with respect to finite element methods (5.8% error on average). To get this agreement, a factor of 2 is added to the original formulation after seeing a reliable offset with FEM results. Three inductance gradient calculations are assessed, and though all agree with FEM results, the Kerrisk method and a regression analysis method developed by Murugan et al. (referred to as the LRM here) perform the best. Six railgun force computation methods are investigated, including the traditional railgun force equation, an equation produced by Waindok and Piekielny, and four methods inspired by the work of Xu et al. Overall, good agreement between the models and outside data is found, but each model’s accuracy varies significantly between comparisons. Lastly, an approximation of the temperature profile in railgun rails originally presented by McCorkle and Bahder is replicated. In total, this work describes railgun technology and moderately complex railgun modeling methods, but is inconclusive about the presence of a middle-ground modeling method.
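
The "railgun force equation" the abstract refers to is commonly written F = 1/2 L'I^2; a small worked example with illustrative values:

```python
def railgun_force(l_prime, current):
    """Traditional railgun force equation: F = 0.5 * L' * I^2.

    l_prime: inductance gradient L' in henries per metre
    current: rail current in amperes
    """
    return 0.5 * l_prime * current ** 2

# Example: L' = 0.5 uH/m at a 1 MA pulse gives 250 kN of force.
print(railgun_force(0.5e-6, 1.0e6))  # 250000.0 N
```
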

20

Liebchen, Gernot Armin. "Data cleaning techniques for software engineering data sets." Thesis, Brunel University, 2010. http://bura.brunel.ac.uk/handle/2438/5951.

Abstract:
Data quality is an important issue which has been addressed and recognised in research communities such as data warehousing, data mining and information systems. It has been agreed that poor data quality will impact the quality of results of analyses and that it will therefore impact on decisions made on the basis of these results. Empirical software engineering has neglected the issue of data quality to some extent. This fact poses the question of how researchers in empirical software engineering can trust their results without addressing the quality of the analysed data. One widely accepted definition for data quality describes it as `fitness for purpose', and the issue of poor data quality can be addressed by either introducing preventative measures or by applying means to cope with data quality issues. The research presented in this thesis addresses the latter with the special focus on noise handling. Three noise handling techniques, which utilise decision trees, are proposed for application to software engineering data sets. Each technique represents a noise handling approach: robust filtering, where training and test sets are the same; predictive filtering, where training and test sets are different; and filtering and polish, where noisy instances are corrected. The techniques were first evaluated in two different investigations by applying them to a large real world software engineering data set. In the first investigation the techniques' ability to improve predictive accuracy in differing noise levels was tested. All three techniques improved predictive accuracy in comparison to the do-nothing approach. The filtering and polish was the most successful technique in improving predictive accuracy. The second investigation utilising the large real world software engineering data set tested the techniques' ability to identify instances with implausible values. These instances were flagged for the purpose of evaluation before applying the three techniques. Robust filtering and predictive filtering decreased the number of instances with implausible values, but substantially decreased the size of the data set too. The filtering and polish technique actually increased the number of implausible values, but it did not reduce the size of the data set. Since the data set contained historical software project data, it was not possible to know the real extent of noise detected. This led to the production of simulated software engineering data sets, which were modelled on the real data set used in the previous evaluations to ensure domain specific characteristics. These simulated versions of the data set were then injected with noise, such that the real extent of the noise was known. After the noise injection the three noise handling techniques were applied to allow evaluation. This procedure of simulating software engineering data sets combined the incorporation of domain specific characteristics of the real world with the control over the simulated data. This is seen as a special strength of this evaluation approach. The results of the evaluation of the simulation showed that none of the techniques performed well. Robust filtering and filtering and polish performed very poorly, and based on the results of this evaluation they would not be recommended for the task of noise reduction. The predictive filtering technique was the best performing technique in this evaluation, but it did not perform significantly well either. 
An exhaustive systematic literature review has been carried out investigating to what extent the empirical software engineering community has considered data quality. The findings showed that the issue of data quality has been largely neglected by the empirical software engineering community. The work in this thesis highlights an important gap in empirical software engineering. It provided clarification and distinctions of the terms noise and outliers. Noise and outliers are overlapping, but they are fundamentally different. Since noise and outliers are often treated the same in noise handling techniques, a clarification of the two terms was necessary. To investigate the capabilities of noise handling techniques a single investigation was deemed as insufficient. The reasons for this are that the distinction between noise and outliers is not trivial, and that the investigated noise cleaning techniques are derived from traditional noise handling techniques where noise and outliers are combined. Therefore three investigations were undertaken to assess the effectiveness of the three presented noise handling techniques. Each investigation should be seen as a part of a multi-pronged approach. This thesis also highlights possible shortcomings of current automated noise handling techniques. The poor performance of the three techniques led to the conclusion that noise handling should be integrated into a data cleaning process where the input of domain knowledge and the replicability of the data cleaning process are ensured.
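
A minimal sketch of the predictive filtering idea described above, assuming scikit-learn supplies the decision tree: train on one partition, then flag instances in another partition that the model misclassifies as candidate noise. The synthetic data and noise rate are illustrative.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
y[rng.choice(400, size=20, replace=False)] ^= 1  # inject 5% class noise

# Predictive filtering: training and test sets differ, unlike robust
# filtering; filtering-and-polish would correct flagged instances
# instead of removing them.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
suspects = np.flatnonzero(tree.predict(X_test) != y_test)
print(f"{len(suspects)} of {len(y_test)} instances flagged as possible noise")
```
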
21

Nakhaeezadeh, Gutierrez Aydin. "Review of Observation and System Identification Techniques in a Verified Model of a Satellite with Flexible Panels." Thesis, Luleå tekniska universitet, Rymdteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:ltu:diva-81772.

Abstract:
The demand for space applications has been increasing over the years. This has resulted in new satellite structures that require precise and robust control management. Satellite design is evolving towards lighter structures, and the combination of lighter structures with precise and robust control has raised the problem of structural vibration control. The control design of structures with large appendages like antennas, booms or solar panels has become a challenge. The flexible dynamics of the appendages need to be considered when performing the attitude analysis of the satellite, since these parts can be easily excited by environmental perturbations such as gravity, gravity gradient or solar wind. The objective of this research project is to develop a high-fidelity model plant of a satellite with flexible panels and to review different system identification techniques used to observe the states of the system. The equations of the model are reviewed and the model is verified against a multi-body software package, Adams. The sensors and actuators are selected and modelled for the control of the rigid body and the observation of the rigid and flexible body. For the implementation of the flexible structure observations, a technique based on a genetic algorithm is applied for optimal sensor location. Finally, different system identification techniques are reviewed for the identification of modal parameters and rigid body parameters. The results illustrate the performance of the model and how the different system identification techniques perform when observing the model states.
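
As a flavour of the identification techniques reviewed, the sketch below fits a second-order ARX model by least squares to synthetic input/output data; the "true" system, chosen to mimic a lightly damped flexible mode, is invented for the example.

```python
import numpy as np

# Fit y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] by least squares.
rng = np.random.default_rng(0)
n = 500
u = rng.standard_normal(n)
y = np.zeros(n)
for k in range(2, n):  # true system: a lightly damped oscillatory mode
    y[k] = 1.6 * y[k - 1] - 0.8 * y[k - 2] + 0.5 * u[k - 1] \
           + 0.01 * rng.standard_normal()

phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])  # regressors for y[2:]
theta, *_ = np.linalg.lstsq(phi, y[2:], rcond=None)
print(theta)  # should recover approximately [1.6, -0.8, 0.5]
```
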
22

Dix-Peek, Stewart. "Pelvic osteotomies for exstrophy : a review of techniques and outcomes at Red Cross Children's Hospital." Master's thesis, University of Cape Town, 2007. http://hdl.handle.net/11427/2838.

23

Obidallah, Waeal. "Multi-Layer Web Services Discovery using Word Embedding and Clustering Techniques." Thesis, Université d'Ottawa / University of Ottawa, 2021. http://hdl.handle.net/10393/41840.

Abstract:
Web services discovery is the process of finding the right Web services that best match the end-users' functional and non-functional requirements. Artificial intelligence, natural language processing, data mining, and text mining techniques have been applied by researchers in Web services discovery to facilitate the process of matchmaking. This thesis contributes to the area of Web services discovery and recommendation, adopting the Design Science Research Methodology to guide the development of useful knowledge, including design theory and artifacts. The lack of a comprehensive review of Web services discovery and recommendation in the literature motivated us to conduct a systematic literature review. Our main purpose in conducting it was to identify and systematically compare current clustering and association rules techniques for Web services discovery and recommendation by answering various research questions, investigating prior knowledge, and identifying gaps in the related literature. We then propose a conceptual model and a typology of Web services discovery systems. The conceptual model provides a high-level representation of Web services discovery systems, including their various elements, tasks, and relationships. The proposed typology is composed of five groups of characteristics: storage and location, formalization, matchmaking, automation, and selection characteristics. We reference the typology to compare Web services discovery methods and architectures from the extant literature by linking them to the five proposed characteristics. We employ the proposed conceptual model with its specified characteristics to design and develop a multi-layer data mining architecture for Web services discovery using word embedding and clustering techniques. The proposed architecture consists of five layers: Web services description and data preprocessing; word embedding and representation; syntactic similarity; semantic similarity; and clustering. In the first layer, we identify the steps to parse and preprocess the Web services documents. Bag of Words with Term Frequency-Inverse Document Frequency (TF-IDF) and three word-embedding models are employed for Web services representation in the second layer. In the third layer, four distance measures, including Cosine, Euclidean, Minkowski, and Word Mover, are studied to find the similarities between Web services documents. In layer four, WordNet and Normalized Google Distance are employed to represent and find the similarity between Web services documents. Finally, in the fifth layer, three clustering algorithms, including affinity propagation, K-means, and hierarchical agglomerative clustering, are investigated to cluster Web services based on the observed document similarities. We demonstrate how each component of the five layers is employed in the process of Web services clustering using randomly selected Web services documents. We conduct an experimental analysis, clustering Web services from a collected dataset of Web services documents and evaluating clustering performance. Using a ground truth for evaluation purposes, we observe that clusters built with the word embedding models performed better than those built using the Bag of Words with TF-IDF model.
Among the three word embedding models, the pre-trained Word2Vec skip-gram model reported higher performance in clustering Web services. Among the three semantic similarity measures, path-based WordNet similarity reported higher clustering performance. Considering the different word representation models and the syntactic and semantic similarity measures, the affinity propagation clustering technique performed best at discovering similarities among Web services.
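
A minimal sketch of one path through the proposed layers (TF-IDF representation, cosine similarity, K-means clustering), assuming scikit-learn is available; the toy service descriptions are invented for illustration.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [  # toy Web service descriptions
    "currency exchange rate conversion service",
    "convert currency with the latest exchange rate",
    "weather forecast by city and date",
    "seven day weather and rain forecast",
]
X = TfidfVectorizer(stop_words="english").fit_transform(docs)  # representation layer
print(cosine_similarity(X[0], X[1]))  # syntactic similarity layer
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # clustering layer: the two currency services share a cluster
```
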
24

Shafiq, Hafiz Adnan, and Zaki Arshad. "Automated Debugging and Bug Fixing Solutions : A Systematic Literature Review and Classification." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3105.

Abstract:
Context: Bug fixing is the process of ensuring correct source code and is done by the developer. Automated debugging and bug fixing solutions minimize human intervention and hence minimize the chance of producing new bugs in the corrected program. Scope and Objectives: In this study we performed a detailed systematic literature review. The scope of the work is to identify all those solutions that correct software automatically or semi-automatically. Solutions for automatic correction of software do not need human intervention, while semi-automatic solutions facilitate a developer in fixing a bug. We aim to gather all such solutions for fixing bugs in design, i.e., code, UML design, algorithms and software architecture. Automated detection, isolation and localization of bugs are not in our scope. Moreover, we are only concerned with software bugs, excluding the hardware and networking domains. Methods: A detailed systematic literature review (SLR) has been performed. A number of bibliographic sources were searched, including Inspec, IEEE Xplore, the ACM digital library, Scopus, Springer Link and Google Scholar. Inclusion/exclusion, study quality assessment, data extraction and synthesis were performed in depth according to the guidelines provided for performing SLRs. Grounded theory was used to analyze the literature data, and Kappa analysis was used to check the agreement level between the two researchers. Results: Through the SLR we identified 46 techniques. These techniques are classified into automated and semi-automated debugging and bug fixing. The strengths and weaknesses of each are identified, along with the types of bugs each can fix and the languages in which each can be implemented. In the end a classification is performed, generating a list of approaches, techniques, tools, frameworks, methods and systems. Alongside this classification and categorization, we separate bug fixing and debugging on the basis of search algorithms. Conclusion: The achieved results cover all automated/semi-automated debugging and bug fixing solutions available in the literature. The strengths/benefits and weaknesses/limitations of these solutions are identified. We also recognize the types of bugs that can be fixed using these solutions, and the programming languages in which they can be implemented. In the end a detailed classification is performed.
25

Ganna, Anil, and Sri Sai Ripughna Rishitosh Sonti. "Analysis of Requirements Volatility in Elicitation Process : A Systematic Literature Review & Survey." Thesis, Blekinge Tekniska Högskola, Institutionen för programvaruteknik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-19339.

Abstract:
Context: In the requirements engineering phase, requirements elicitation is considered the most important task, as it is the initial phase in which the requirements are gathered and prioritised. Changes in requirements may lead to project failure or delays in project deliveries, so it is essential to elicit the requirements at an early stage to avoid changes later in development. There is therefore a need to study the impact of volatility on elicitation techniques so that requirements can be gathered appropriately in the early stages. Objectives: In the present thesis, we focus on the analysis of requirements volatility in the requirements elicitation phase. The main objectives we have formulated to achieve our goal are, Objective 1: to identify and determine the various causes of requirements volatility; Objective 2: to examine the impact of requirements volatility on the requirements elicitation process; Objective 3: to examine whether the procedure of elicitation techniques differs if volatility occurs while eliciting the requirements. Methods: In this thesis, we implemented a Systematic Literature Review (SLR) and a survey in order to attain our aim and objectives. The SLR is performed for Objective 1, to obtain data about the causes of volatility in the various development life cycle phases. The survey is conducted to identify the causes of volatility in all phases of development and in the elicitation phase in particular, and to check whether the process of the elicitation techniques differs if volatility occurs while eliciting the requirements. Results: From the SLR and the survey, numerous factors causing volatility in the software development life cycle were identified, several of them new from both research methods; each factor has its own interpretation as a cause of volatility. Moreover, from the survey results, we can determine that volatility occurs in the elicitation phase and has a huge impact while eliciting requirements. Most of the practitioners working with the agile development process and the waterfall model stated that the impact of volatility results in prolonging the elicitation phase, slowing down the project, etc. Conclusions: Our contribution is to provide insights on the impact of volatility on the elicitation process and to check whether the elicitation techniques and their process change due to volatility. Based on the respondents' answers, we can conclude that changes in elicitation technique procedures are not intentional, and arise not only because of volatility but also due to external factors encountered while eliciting the requirements.
26

Dória, Emerson Silas. "Replicação de estudos empíricos em engenharia de software." Universidade de São Paulo, 2001. http://www.teses.usp.br/teses/disponiveis/55/55134/tde-10052002-103851/.

Abstract:
The increasing use of computer-based systems in practically all areas of human activity provokes a growing demand for quality and productivity, both from the point of view of the software process and from the point of view of the software products generated. In this perspective, activities aggregated under the name of Software Quality Assurance have been introduced throughout the software development process. Among these activities, testing and review are distinguished, both aiming to minimize the introduction of errors into the software products generated during the development process. The testing activity constitutes one of the elements that supply evidence of software reliability, complementing other activities such as the use of reviews and of formal, rigorous specification and verification techniques. The review activity, in turn, is an efficient 'filter' for the software engineering process, since it favors the identification and elimination of errors before the next step of the development process. Currently, research is being carried out with the objective of determining which technique, review or testing, is more appropriate and effective in certain circumstances for discovering particular classes of errors, and, more broadly, how the techniques can be applied in a complementary way to improve software quality. Even though the testing activity is indispensable in the development process, investigating the complementary aspect of these techniques is of great interest, for in many situations reviews have been observed to be as effective as or more effective than testing. In this perspective, this work aims to accomplish a comparative study, through the replication of experiments, between testing techniques and review techniques with respect to error detection in software products (source code and requirements specification documents). The study uses testing criteria of the functional technique (equivalence partitioning and boundary value analysis), the structural technique (all-nodes, all-edges, all-uses, all-potential-uses) and the error-based technique (mutation analysis), as well as reading techniques (stepwise abstraction and perspective-based reading) and inspection techniques (ad hoc and checklist). Besides comparing the effectiveness and efficiency of the techniques in detecting errors in software products, this work also aims to use specific knowledge related to testing criteria to re-evaluate the techniques used in the experiments of Basili & Selby, Kamsties & Lott, and Basili.
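
To illustrate the functional criteria compared in this work, the sketch below applies equivalence partitioning and boundary value analysis to a toy triangle classification function; the program and the test cases are invented, not taken from the replicated experiments.

```python
def classify_triangle(a, b, c):
    """Toy system under test for the functional testing criteria."""
    if a + b <= c or a + c <= b or b + c <= a:
        return "not a triangle"
    if a == b == c:
        return "equilateral"
    if a == b or b == c or a == c:
        return "isosceles"
    return "scalene"

# Equivalence partitioning: one representative input per class.
partitions = {
    (3, 4, 5): "scalene",
    (2, 2, 3): "isosceles",
    (4, 4, 4): "equilateral",
    (1, 2, 9): "not a triangle",
}
# Boundary value analysis: exercise the a + b == c validity limit.
boundaries = {(1, 2, 3): "not a triangle", (2, 3, 4): "scalene"}

for case, expected in {**partitions, **boundaries}.items():
    assert classify_triangle(*case) == expected, case
print("all functional test cases pass")
```
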
APA, Harvard, Vancouver, ISO, and other styles
27

Ehlers, PJ, CG Richards, DV Nicolae, E. Monacelli, and Y. Hamam. "Review of the state of the Art of modulation techniques and control strategies for matrix converters." International Review of Automatic Control, 2008. http://encore.tut.ac.za/iii/cpro/DigitalItemViewPage.external?sp=1001151.

Full text
Abstract:
The reliability and stability of the matrix converter have improved in recent years thanks to enhanced control algorithms. The traditional direct transfer function control mode has been replaced by more complex, digitally implemented control methodologies. These methodologies allow real-time calculation of the optimal switching interval of each individual switch of the matrix converter. The new switching algorithms enable optimal performance, ensuring sinusoidal outputs at any desired power factor. This paper first revisits the underlying theory of matrix converters, then reviews the various control limitations and finally reviews current control algorithms.
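For orientation on the algorithm family such reviews cover, the sketch below implements the classic Venturini direct-transfer-function modulation for a 3x3 matrix converter and checks its basic properties. All parameter values are illustrative assumptions, and this is a generic textbook formulation, not the paper's own algorithms.

```python
import numpy as np

V_IM, F_IN = 311.0, 50.0   # input peak phase voltage and frequency (assumed)
Q, F_OUT = 0.4, 30.0       # voltage transfer ratio (q <= 0.5) and output frequency

def duty_cycles(t: float) -> np.ndarray:
    """3x3 matrix m[k, j]: duty cycle connecting output phase k to input phase j."""
    phases = 2 * np.pi * np.arange(3) / 3
    v_in = V_IM * np.cos(2 * np.pi * F_IN * t - phases)        # input phase voltages
    v_out = Q * V_IM * np.cos(2 * np.pi * F_OUT * t - phases)  # target output voltages
    # Venturini solution for unity input displacement factor:
    return (1.0 + 2.0 * np.outer(v_out, v_in) / V_IM**2) / 3.0

m = duty_cycles(t=0.004)
assert np.allclose(m.sum(axis=1), 1.0)   # each output is connected 100% of the time
assert np.all((m >= 0) & (m <= 1))       # realisable duty cycles for q <= 0.5
v_in = V_IM * np.cos(2 * np.pi * F_IN * 0.004 - 2 * np.pi * np.arange(3) / 3)
print("synthesised output:", m @ v_in)   # low-frequency average equals the sinusoidal target
```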
APA, Harvard, Vancouver, ISO, and other styles
28

Birch, David. "A review of vibration signal processing techniques for use in a real time condition monitoring system." Master's thesis, University of Cape Town, 1994. http://hdl.handle.net/11427/9641.

Full text
Abstract:
Bibliography: p. 181-183.
The analysis of the vibrations produced by roller bearings is one of the most widely used techniques for determining the condition of rolling element bearings. This project forms part of an overall plan to gain experience in condition monitoring and to produce a computer-aided vibration monitoring system that would initially be applied to rolling element bearings and later to other machine components. The particular goal of this project is to study signal processing techniques that will be of use in this system. The general signal processing problems are as follows. The vibration of an undamaged bearing is characterised by a Gaussian distribution and a white power spectral density. Once a bearing is damaged, the nature of the vibration changes, often with spikes or impulses present in the vibration signal. By detecting these impulses a measure of the condition of the bearing may be obtained. The primary goal in machine condition determination then becomes the detection of these impulses in the presence of noise and contaminating signals, and discriminating between those caused by the component in question and those from other sources. A wide range of signal processing techniques were reviewed and some of these were tested on vibrations recorded on the Mechanical Engineering Department's bearing test rig. It was found that the time-domain statistics (RMS, kurtosis, crest factor) were the simplest to use but could be unreliable. On the other hand, frequency-domain analysis techniques, such as the power spectrum, were more reliable but more difficult to apply. By making use of a variety of these techniques and applying them in a systematic manner, it is possible to assess bearing condition under a wide variety of operating conditions. A small number of the signal processing techniques were programmed for a DSP processor. It was found that all of the techniques, with the exception of the bispectrum, could be programmed for the DSP chip. However, the available DSP card did not have sufficient memory to allow analysis and preprocessing routines to be combined; in addition, the analogue-to-digital conversion system would benefit from buffered I/O. The project should continue, with the DSP card being upgraded and all the necessary signal processing routines programmed. The project can then move to the next phase, which would be the inclusion of display and interface software and artificial intelligence analysis aids.
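A brief sketch of the time-domain indicators the thesis compares (RMS, kurtosis, crest factor), computed here on a synthetic vibration signal; the sampling rate and impulse train are assumed for illustration and are unrelated to the test-rig recordings.

```python
import numpy as np

rng = np.random.default_rng(0)
fs, n = 20_000, 20_000                     # 1 s of data at 20 kHz (assumed)
healthy = rng.normal(0.0, 1.0, n)          # undamaged bearing: ~Gaussian noise
impulses = np.zeros(n)
impulses[::fs // 120] = 12.0               # damage: impact train at ~120 Hz
damaged = healthy + impulses

def indicators(x: np.ndarray) -> dict:
    rms = np.sqrt(np.mean(x**2))
    kurt = np.mean((x - x.mean())**4) / np.var(x)**2   # ~3 for Gaussian data
    return {"rms": rms, "kurtosis": kurt, "crest": np.max(np.abs(x)) / rms}

print("healthy:", indicators(healthy))     # kurtosis near 3, modest crest factor
print("damaged:", indicators(damaged))     # impulses raise kurtosis and crest factor
```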
APA, Harvard, Vancouver, ISO, and other styles
29

Moneta, Francesco Maria. "State of the art techniques for creating secure software within the Agile process: a systematic literature review." Bachelor's thesis, Alma Mater Studiorum - Università di Bologna, 2018. http://amslaurea.unibo.it/16803/.

Full text
Abstract:
Agile processes have become ubiquitous in the software development community and are used by the majority of companies. At the same time, the need for secure and trustworthy software has been growing steadily. Agile software processes have nonetheless proven difficult to integrate with the pre-existing security frameworks developed for waterfall processes. This thesis presents the results of a systematic literature review that investigates solutions to this problem. The research questions the researcher tried to answer are: "Which are the latest solutions for enhancing the security of software developed using the Agile process?" and "Which of the discussed solutions have performed best in pilot studies?". The study analyzed 39 papers published between 2011 and 2018. The results were ordered according to which exhibited the highest consensus and coded into four sets. The most salient suggestions were: increase the training of developers, add dedicated security roles to the development team, hybridize security solutions from waterfall processes, and add security artifacts such as the "security backlog" and "evil user stories" to Agile.
APA, Harvard, Vancouver, ISO, and other styles
30

Raak, Norbert, Raffaele Andrea Abbate, Albena Lederer, Harald Rohm, and Doris Jaros. "Size Separation Techniques for the Characterisation of Cross-Linked Casein: A Review of Methods and Their Applications." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2018. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-234862.

Full text
Abstract:
Casein is the major protein fraction in milk, and its cross-linking has been a topic of scientific interest for many years. Enzymatic cross-linking has huge potential to modify relevant techno-functional properties of casein, whereas non-enzymatic cross-linking occurs naturally during the storage and processing of milk and dairy products. Two size separation techniques have been applied for the characterisation of these reactions: gel electrophoresis and size exclusion chromatography. This review summarises their separation principles and discusses the outcome of studies on cross-linked casein from the last ~20 years. Both methods, however, show limitations concerning separation range and are applied mainly under denaturing and reducing conditions. In contrast, field flow fractionation has a broad separation range and can easily be applied under native conditions. Although this method has become a powerful tool in polymer and nanoparticle analysis and has been used in a few studies on casein micelles, it has not yet been applied to investigate cross-linked casein. Finally, the principles and requirements for absolute molar mass determination are reviewed, which will be of increasing interest in the future since suitable calibration substances for casein polymers are scarce.
APA, Harvard, Vancouver, ISO, and other styles
31

Yusufoglu, Ayca. "A Critical Review Of The Tools And Techniques Used In Coastal Planning: Case Study Mugla-gokova Special Environmental Protection Area." Master's thesis, METU, 2010. http://etd.lib.metu.edu.tr/upload/12612354/index.pdf.

Full text
Abstract:
A CRITICAL REVIEW OF THE TOOLS & TECHNIQUES USED IN COASTAL PLANNING: CASE STUDY MUGLA-GÖKOVA SPECIAL ENVIRONMENTAL PROTECTION AREA. Yusufoglu, Ayça. M.S., Programme of City Planning. Supervisor: Assoc. Prof. Dr. Serap Kayasü. June 2010, 119 pages. This study focuses on coastal area management in terms of planning techniques and tools, as well as the legal aspects necessary to clarify the components of a successful coastal area planning process. It is emphasized that planning of the coastal area should be performed within the context of an integrated policy mechanism, considering the maintenance of biodiversity, public participation, and the promotion of diversification among coast-related economic uses such as tourism, aquaculture and fishing. The thesis is organised according to definitions of coastal areas; coastal planning and legislation; institutions, organisations and international commissions regarding coastal areas; and the Gökova Special Environmental Protection Area from the perspective of Integrated Coastal Management (ICM). The case section of the thesis is formed by the five phases of the Gökova SEPA 1/25.000-scale Environmental Relation Plan in order to achieve ICM. This is also the first study to review the tools and techniques used in the Gökova SEPA with respect to the Integrated Coastal Management approach.
APA, Harvard, Vancouver, ISO, and other styles
32

Finne, Emily, Melanie Glausch, Anne-Kathrin Exner, Odile Sauzet, Friederike Stölzel, and Nadja Seidel. "Behavior change techniques for increasing physical activity in cancer survivors: a systematic review and meta-analysis of randomized controlled trials." Dove Medical Press, 2018. https://tud.qucosa.de/id/qucosa%3A33823.

Full text
Abstract:
Purpose: The purpose of this systematic review and meta-analysis is to investigate how physical activity (PA) can be effectively promoted in cancer survivors. The effect of PA-promoting interventions in general, behavior change techniques (BCTs), and further variables as moderators in particular are evaluated. Methods: This study included randomized controlled trials of lifestyle interventions aiming at an increase in PA that can be carried out independently at home, published by December 2016, for adults diagnosed with cancer after completion of the main treatment. Primary outcomes were subjective and objective measures of PA prior to and immediately after the intervention. Meta-analysis and meta-regression were used to estimate effect sizes (ES) in terms of standardized mean differences, variation between ES in terms of heterogeneity indices (I²), and moderator effects in terms of regression coefficients. Results: This study included 30 studies containing 45 ES with an overall significant small positive effect size of 0.28 (95% confidence interval=0.18–0.37) on PA, and I²=54.29%. The BCTs Prompts, Reduce prompts, Graded tasks, Non-specific reward, and Social reward were significantly related to larger effects, while Information about health consequences and Information about emotional consequences, as well as Social comparison were related to smaller ES. The number of BCTs per intervention did not predict PA effects. Interventions based on the Theory of Planned Behavior were associated with smaller ES, and interventions with a home-based setting component were associated with larger ES. Neither the duration of the intervention nor the methodological quality explained differences in ES. Conclusion: Certain BCTs were associated with an increase of PA in cancer survivors. Interventions relying on BCTs congruent with (social) learning theory such as using prompts and rewards could be especially successful in this target group. However, large parts of between-study heterogeneity in ES remained unexplained. Further primary studies should directly compare specific BCTs and their combinations.
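For orientation, the sketch below shows the standard DerSimonian-Laird random-effects pooling with an I² heterogeneity estimate, the kind of computation behind the pooled effect size reported above; the per-study effect sizes and variances are invented for illustration, not the study's data.

```python
import numpy as np

es = np.array([0.10, 0.25, 0.40, 0.55, 0.05])   # per-study standardized mean differences
var = np.array([0.02, 0.03, 0.01, 0.04, 0.02])  # their sampling variances

w = 1 / var                                      # fixed-effect (inverse-variance) weights
q = np.sum(w * (es - np.sum(w * es) / w.sum())**2)   # Cochran's Q statistic
df = len(es) - 1
c = w.sum() - np.sum(w**2) / w.sum()
tau2 = max(0.0, (q - df) / c)                    # between-study variance estimate
i2 = max(0.0, (q - df) / q) * 100                # % of variation due to heterogeneity

w_re = 1 / (var + tau2)                          # random-effects weights
pooled = np.sum(w_re * es) / w_re.sum()
se = np.sqrt(1 / w_re.sum())
print(f"pooled SMD = {pooled:.2f} "
      f"(95% CI {pooled - 1.96*se:.2f} to {pooled + 1.96*se:.2f}), I^2 = {i2:.1f}%")
```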
APA, Harvard, Vancouver, ISO, and other styles
33

Mamun, Md Abdullah Al, and Aklima Khanam. "Concurrent Software Testing : A Systematic Review and an Evaluation of Static Analysis Tools." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4310.

Full text
Abstract:
Verification and validation is one of the most important concerns in software engineering for more reliable software development, and it is therefore important to overcome the challenges of testing concurrent programs. The extensive use of concurrent systems warrants more attention to concurrent software testing, and the development of automated tools for testing concurrent software is receiving increased focus. The first part of this study presents a systematic review that aims to explore the state of the art of concurrent software testing. The systematic review reports on several issues, such as concurrent software characteristics, bugs, testing techniques and tools, test case generation techniques and tools, and benchmarks developed for the tools. The second part presents an evaluation of four commercial and open source static analysis tools for detecting Java multithreaded bugs. An empirical evaluation of the tools would help industry as well as academia to learn more about the effectiveness of static analysis tools for concurrency bugs.
APA, Harvard, Vancouver, ISO, and other styles
34

Swan, Adrian Kenneth. "“Out with the old and in with the new” - A retrospective review of paediatric craniocervical junction fixation: indications, techniques and outcomes." Master's thesis, Faculty of Health Sciences, 2019. http://hdl.handle.net/11427/30853.

Full text
Abstract:
Background: The paediatric craniocervical junction has anatomical, physiological and biomechanical properties that make this region distinct from the adult spine and vulnerable to injury, and that contribute to the complexity of management. Traditionally, on-lay fusion with external Halo immobilisation has been used. Instrumented fusion offers intra-operative reduction and immediate stability. Methods: A retrospective review of a single surgeon's prospectively maintained database was conducted for all paediatric patients who had undergone a fusion involving the occipito-atlanto-axial region. Case notes were reviewed and a radiological analysis was done. Results: Sixteen patients were managed with on-lay fusion and external immobilisation, and twenty-seven patients were managed with internal fixation using screw-rod constructs. The fusion rates were 80% and 90.5%, respectively. Allograft bone grafting was found to be a significant risk factor for non-union. Conclusion: The screws can be safely and predictably placed, as confirmed on radiological follow-up, with a high fusion rate and an acceptable complication rate. Uninstrumented on-lay fusion with Halo immobilisation remains an acceptable alternative. Allograft in the form of bone croutons or demineralised bone matrix is a significant risk factor for non-union, and posterior iliac crest graft should be used preferentially.
APA, Harvard, Vancouver, ISO, and other styles
35

Olsen, Jens, and Torsten Muhrbeck. "Surgical Removal of Ameloblastoma and Keratocystic Odontogenic Tumors in Maxilla and Mandible, a Literature Review on Surgical Techniques and Risk of Recurrence." Thesis, Umeå universitet, Institutionen för odontologi, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-128183.

Full text
Abstract:
This literature review examines the literature on the surgical management of ameloblastoma and keratocystic odontogenic tumours (KCOT). KCOT represent 3-11 % of all cystic lesions in the jaws, and ameloblastoma 11 % of odontogenic tumours. Treatment involves removal of the tumours by means of enucleation, curettage, marsupialization or resection. The first three can be combined with each other or with the adjunctive therapies: application of Carnoy's solution or cryotherapy. The aim of this literature review is to evaluate the risk of complications associated with the different surgical techniques for removal of KCOT or ameloblastoma. A search was performed in PubMed based on our keywords (Marsupialization, decompression, fenestration, enukleation, KCOT, OKC, KOT, keratocystic odontogenic tumor, odontogenic keratocyst, ameloblastoma, outcome, follow-up, relapse, prognosis, recurrence). The data were managed in Excel. Twenty articles met our criteria: 12 articles reported on KCOT in 667 patients and 8 articles reported on 191 patients with ameloblastoma. The articles almost exclusively presented the risk of recurrence for different treatment modalities, so the results mainly contain recurrence rates for the different surgical techniques. 412 KCOT patients received enucleation alone and 92 recurred, a recurrence rate of 22.3 %. 91 patients with ameloblastoma received resection and four recurred, a recurrence rate of 4.4 %. This review fails to identify any reliable evidence on recurrence rates in relation to treatment modalities for KCOT and ameloblastoma. Further prospective controlled clinical trials are essential to address this important issue.
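As a quick check of the arithmetic behind the quoted recurrence rates, the sketch below reproduces them from the raw counts and adds a Wilson 95% confidence interval; the interval is our illustrative addition, not a statistic reported in the review.

```python
import math

def wilson_ci(events: int, n: int, z: float = 1.96) -> tuple:
    """Wilson score interval for a binomial proportion."""
    p = events / n
    centre = (p + z**2 / (2 * n)) / (1 + z**2 / n)
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / (1 + z**2 / n)
    return centre - half, centre + half

for label, events, n in [("KCOT, enucleation alone", 92, 412),
                         ("ameloblastoma, resection", 4, 91)]:
    lo, hi = wilson_ci(events, n)
    # 92/412 -> 22.3 %; 4/91 -> 4.4 %, matching the rates quoted above
    print(f"{label}: {100 * events / n:.1f}% (95% CI {100 * lo:.1f}-{100 * hi:.1f}%)")
```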
APA, Harvard, Vancouver, ISO, and other styles
36

Viana, Renato Frazzato. "Técnicas de classificação aplicadas a credit scoring: revisão sistemática e comparação." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/104/104131/tde-18012017-112044/.

Full text
Abstract:
With the growing demand for credit, it is very important to evaluate the risk of each operation of this kind. When granting credit to a customer, it is therefore necessary to assess the chances that the customer will not repay the loan, and credit scoring techniques are applied to this task. This work presents a literature review of credit scoring, aiming to give an overview of the many techniques employed, and a computational simulation study is carried out in order to compare the behaviour of several of the techniques presented in the study.
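As one concrete instance of the kind of technique such a review covers, the sketch below fits a logistic regression credit-scoring model that estimates a customer's default probability; the features, coefficients and data are synthetic assumptions for illustration, not the thesis's experiment.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000
X = np.column_stack([
    rng.normal(40, 12, n),          # age
    rng.normal(3_000, 1_200, n),    # monthly income
    rng.integers(0, 5, n),          # number of past delinquencies
])
# Simulated ground truth: default probability rises with past delinquencies.
logit = -2.0 - 0.02 * (X[:, 0] - 40) - 0.0004 * (X[:, 1] - 3_000) + 0.8 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))    # simulated default indicator

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1_000).fit(X_tr, y_tr)
print("test AUC:", round(roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]), 3))
```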
APA, Harvard, Vancouver, ISO, and other styles
37

Leung, Erika. "MODS and/or TLA techniques: a systematic review and meta-analysis for active tuberculosis diagnosis and an evaluation of their cost-effectiveness and feasibility." Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=97037.

Full text
Abstract:
OBJECTIVE: A systematic review and meta-analysis was performed to compare Microscopic-Observation Drug Sensitivity (MODS), Thin Layer Agar (TLA) and reference standards with respect to sensitivity and specificity for tuberculosis detection and other characteristics. A questionnaire was administered to evaluate the feasibility, costs and practical aspects of implementing MODS/TLA. METHODS: A random-effects meta-analysis was performed to estimate sensitivity and specificity. A self-administered questionnaire was sent to laboratories using MODS/TLA. RESULTS: The overall sensitivity and specificity were 92% and 97% for MODS, and 83% and 98% for TLA, respectively. Equipment and training costs were moderate, costs for materials were low, and labour costs were high. CONCLUSION: MODS and TLA appear to offer simple, inexpensive, yet rapid and accurate diagnostic tools for active TB. Overall, implementation costs for MODS/TLA were moderate. Important unresolved issues for further investigation include the cost-effectiveness of these TB diagnostic tools and optimal methods for their quality assurance.
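The sketch below shows how such accuracy measures are computed from a 2x2 table of an index test (e.g. MODS) against the reference standard; the counts are hypothetical, chosen only so that the point estimates echo the pooled MODS figures above.

```python
def sens_spec(tp: int, fp: int, fn: int, tn: int) -> tuple:
    sensitivity = tp / (tp + fn)     # proportion of true TB cases detected
    specificity = tn / (tn + fp)     # proportion of non-cases correctly ruled out
    return sensitivity, specificity

# Hypothetical counts: 100 culture-positive and 100 culture-negative specimens.
se, sp = sens_spec(tp=92, fp=3, fn=8, tn=97)
print(f"sensitivity = {se:.0%}, specificity = {sp:.0%}")   # 92%, 97%
```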
APA, Harvard, Vancouver, ISO, and other styles
38

Olorisade, Babatunde Kazeem. "Summarizing the Results of a Series of Experiments : Application to the Effectiveness of Three Software Evaluation Techniques." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3799.

Full text
Abstract:
Software quality has become, and persistently remains, a major issue among software users and developers, so the importance of software evaluation cannot be overemphasized. It is an accepted fact in software engineering that software must undergo an evaluation process during development to ascertain and improve its quality. In fact, there are more techniques than a single developer could master, and yet it is impossible to be certain that software is free of defects. It may therefore not be realistic or cost-effective to remove all software defects prior to product release. It is thus crucial for developers to be able to choose, from the available evaluation techniques, the one most suitable and most likely to yield optimum quality results for different products: it boils down to choosing the most appropriate technique for each situation. However, not much knowledge is available on the strengths and weaknesses of the available evaluation techniques. Most of the information related to these techniques focuses on how to apply them, not on their applicability conditions: practical information, suitability, strengths, weaknesses and so on. This research contributes to the available applicability knowledge of software evaluation techniques. More precisely, it focuses on code reading by stepwise abstraction as representative of the static techniques, and on equivalence partitioning (functional technique) and decision coverage (structural technique) as representatives of the dynamic techniques. The specific focus of the research is to summarize the results of a series of experiments conducted to investigate the effectiveness of these techniques, among other factors. By effectiveness we mean the potential of each technique to generate test cases capable of revealing software faults, in the case of the dynamic techniques, or the ability of the static technique to generate abstractions that aid the detection of faults. The experiments used two versions of three different programs, with seven different faults seeded into each program. This work uses the results of the eight experiments, performed and analyzed separately, to explore this question. The analysis results were pooled together and jointly summarized to extract common knowledge from the experiments, using a qualitative deduction approach created in this work, as it was decided not to use formal aggregation at this stage. Since the experiments were performed by different researchers, in different years and in some cases at different sites, several problems had to be tackled in order to summarize the results: the data files existed in different languages, the structure of the files differed, different names were used for data fields, the analyses were done using different confidence levels, and so on. The first step, taken at the inception of this research, was to apply all the techniques to the programs used in the experiments in order to detect the faults. The purpose of this first-hand experience was to become acquainted with the faults, failures, programs and experimental situations in general, and to better understand the data as recorded in the experiments. Afterwards, the data files were recreated to conform to a uniform language, data meaning, file style and structure.
A well-structured directory was created to keep all the data, analysis and experiment files for all the experiments in the series. These steps paved the way for a feasible synthesis of results. Using our method, technique, program, fault, program-technique, program-fault and technique-fault were selected as the main and interaction effects carrying knowledge relevant to the summary of the analyses. The results, as reported in this thesis, indicate that the functional and structural techniques are equally effective as far as the programs and faults in these experiments are concerned, and that both perform better than code reading. The analysis also revealed that the effectiveness of the techniques is influenced by the fault type and the program type: some faults were better detected in certain programs, some were better detected by certain techniques, and the techniques yielded different results in different programs.
I can alternatively be contacted through: qasimbabatunde@yahoo.co.uk
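To make the structural criterion used in the experiments concrete, the sketch below shows a toy function and a test set achieving 100% decision (branch) coverage; the function is our own example, not one of the experiment's programs.

```python
def classify(x: int, y: int) -> str:
    if x > 0:            # decision 1
        if y > 0:        # decision 2
            return "both positive"
        return "x positive only"
    return "x not positive"

# Three test cases suffice for 100% decision coverage:
# (1, 1)  -> decision 1 true,  decision 2 true
# (1, -1) -> decision 1 true,  decision 2 false
# (-1, 0) -> decision 1 false (decision 2 not reached)
assert classify(1, 1) == "both positive"
assert classify(1, -1) == "x positive only"
assert classify(-1, 0) == "x not positive"
print("decision coverage achieved with 3 test cases")
```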
APA, Harvard, Vancouver, ISO, and other styles
39

Leturcq, Alexandra. "Proportionnalité et droits fondamentaux : recherches comparées sur le travail du juge américain au regard des expériences canadienne, sud-africaine et de la Cour européenne des droits de l'homme." Thesis, Aix-Marseille, 2012. http://www.theses.fr/2012AIXM1006.

Full text
Abstract:
There is no general principle of proportionality in American law. A proportionality review is nevertheless present in the Supreme Court's case law, even though it is not always recognized as such. In view of the techniques used for this purpose, part of the legal community sees an exceptional American experience, while another part qualifies this view in the field of rights limitation. The 1938 decision US v. Carolene Products, which ended a period of judicial interventionism, is the starting point of this study. During the Lochner era, the quasi-automatic invalidation of statutes restricting economic liberties earned the Court the label "government by the judiciary". To secure its legitimacy, the Court elaborated the "levels of review" doctrine, according to which the standard of justification for limitations depends on the nature of the right restricted. Its work has since been rationalized by the "substantial constraint" of fundamental rights, which contributed to the emergence of a new theory of judicial review. The analysis of the techniques of proportionality review shows that several jurisdictions restrain themselves in a way comparable to their American counterpart, in particular the Supreme Court of Canada, the South African Constitutional Court and the European Court of Human Rights, although each of them presents specificities in this field. The two main modes of "balancing" then make it possible to highlight the convergences and divergences between the systems. Stare decisis in particular conditions methodological and normative coherence in the United States and influences the fundamental-rights constraint, curbing in a distinctive way the expansion of the judicial power of interpretation. From a comparative perspective, American particularism is revealed by the definition of these techniques and by their effect on a differentiated guarantee of rights.
APA, Harvard, Vancouver, ISO, and other styles
40

Khan, M. Shahan Ali, and Ahmad ElMadi. "Data Warehouse Testing : An Exploratory Study." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-4767.

Full text
Abstract:
Context. The use of data warehouses, a specialized class of information systems, by organizations all over the globe has recently experienced a dramatic increase. A Data Warehouse (DW) serves organizations for various important purposes, such as reporting and strategic decision making. Maintaining the quality of such systems is a difficult task, as DWs are much more complex than ordinary operational software applications; therefore, conventional methods of software testing cannot be applied to DW systems. Objectives. The objectives of this thesis were to investigate the current state of the art in DW testing, to explore various DW testing tools and techniques and the challenges in DW testing, and to identify improvement opportunities for the DW testing process. Methods. This study consists of an exploratory and a confirmatory part. In the exploratory part, a Systematic Literature Review (SLR) followed by the Snowball Sampling Technique (SST), a case study at a Swedish government organization, and interviews were conducted. For the SLR, a number of article sources were used, including Compendex, Inspec, IEEE Xplore, the ACM Digital Library, SpringerLink, Science Direct and Scopus. References in selected studies and citation databases were used for performing backward and forward SST, respectively. 44 primary studies were identified as a result of the SLR and SST. For the case study, interviews with 6 practitioners were conducted. The case study was followed by 9 additional interviews with practitioners from different organizations in Sweden and from other countries. The exploratory phase was followed by the confirmatory phase, in which the challenges identified during the exploratory phase were validated by conducting 3 more interviews with industry practitioners. Results. In this study we identified various challenges that are faced by industry practitioners, as well as various tools and testing techniques that are used for testing DW systems. 47 challenges and a number of testing tools and techniques were found in the study. The challenges were classified and improvement suggestions were made to address them in order to reduce their impact. Only 8 of the challenges were found to be common to the industry and the literature studies. Conclusions. Most of the identified challenges were related to test data creation and to the need for tools for various purposes of DW testing. The rising trend of DW systems requires a standardized testing approach and tools that can help to save time by automating the testing process. While tools for operational software testing are available commercially as well as from the open source community, there is a lack of such tools for DW testing. It was also found that a number of challenges are related to management activities, such as lack of communication and difficulties in estimating the DW testing budget. We also identified the need for a comprehensive framework for testing data warehouse systems and for tools that can help to automate the testing tasks. Moreover, it was found that the impact of management factors on the quality of DW systems should be measured.
Shahan (+46 736 46 81 54), Ahmad (+46 727 72 72 11)
APA, Harvard, Vancouver, ISO, and other styles
41

Dias, Clarissa Vilela Rodrigues Vieira de Carvalho. "Ultrassom para monitorização da estimulação ovariana controlada: revisão sistematizada e metanálise de estudos randomizados controlados." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/17/17145/tde-20072016-161026/.

Full text
Abstract:
Background: Assisted reproductive techniques (ART) for the treatment of infertility/subfertility include the in vitro handling of human oocytes and sperm, or of embryos, with the objective of achieving pregnancy and live birth. The recruitment of multiple follicles is often necessary for better pregnancy rates and is achieved by performing controlled ovarian stimulation (COS). COS monitoring is performed by ovarian follicle counting and ultrasonographic measurements, with or without hormone assays. Monitoring the follicular phase is justified for decisions regarding the administered gonadotropin dose, for assessing the risk of ovarian hyperstimulation syndrome (OHSS), and for determining the best time to trigger final follicular maturation. However, the need for intensive COS monitoring is controversial: the combination of methods consumes more time and resources and adds discomfort for the woman undergoing ART. Objectives: To evaluate the efficacy and safety of monitoring COS in assisted reproduction cycles using ultrasonography (US) alone. Search methods: Searches for randomized controlled trials were performed in the main electronic databases; in addition, we hand-searched the reference lists of included studies and similar reviews. The last electronic search was performed in March 2015. Selection criteria: Only truly randomized controlled trials comparing COS monitoring by US plus hormonal assessment versus US alone, by two-dimensional US (2DUS) versus three-dimensional US (3DUS), or by 2DUS versus self-operated endovaginal telemonitoring (SOET) were considered eligible. Studies that permitted the inclusion of the same participant more than once (cross-over or 'per cycle' trials) were included only if data regarding the first cycle of each participant were available. Data collection and analysis: Two reviewers independently assessed study eligibility, extracted data and assessed the risk of bias; disagreements were resolved by consulting a third reviewer. When necessary, we contacted the authors of the included studies for further information. Results: The search retrieved 1717 records, of which ten studies were eligible. No study reported live birth. Six studies compared US alone versus US plus hormonal assessment. The confidence intervals (CIs) were large and did not allow us to conclude benefit or harm associated with US alone for either OHSS (odds ratio OR=1.03, 95%CI 0.48 to 2.18, P=0.95) or miscarriage (relative risk RR=0.37, 95%CI 0.07 to 1.79, P=0.21). For clinical pregnancy, the CI was compatible with small benefit to small harm (RR=0.96, 95%CI 0.80 to 1.16, P=0.70). For the number of oocytes retrieved, the CI was compatible with appreciable benefit to no effect (mean difference MD=0.92 oocytes, 95%CI -0.19 to 2.04, P=0.70). Two studies compared 3DUS versus 2DUS: the CIs were large and did not allow us to conclude benefit or harm associated with 3DUS regarding clinical pregnancy (RR=1.00, 95%CI 0.58 to 1.73) or the number of oocytes retrieved (MD=-0.37 oocytes, 95%CI -3.63 to 2.89). One study compared 2DUS versus SOET; the CI was large and did not allow us to conclude benefit or harm associated with SOET regarding clinical pregnancy (RR=0.95, 95%CI 0.52 to 1.75) or the number of oocytes retrieved (MD=0.50, 95%CI -2.13 to 3.13). Authors' conclusions: Regarding efficacy, the current evidence suggests that monitoring COS by US alone should not substantially change the chances of achieving clinical pregnancy, and the number of oocytes retrieved is similar to that obtained with monitoring by US plus hormonal assessment. Regarding safety, monitoring by US alone also does not appear to increase the risk of developing OHSS. However, these results should be interpreted with caution: for all outcomes and comparisons the available data are inconclusive, as the quality of the evidence was compromised by imprecision and by poor reporting of study methodology. Further studies evaluating the ideal procedure for monitoring COS are needed.
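As a worked illustration of the relative-risk estimates quoted above, the sketch below computes an RR with a 95% confidence interval on the log scale from 2x2 counts; the event counts are hypothetical and not taken from any included trial.

```python
import math

def relative_risk(e1: int, n1: int, e2: int, n2: int) -> tuple:
    """RR of group 1 vs. group 2 with a 95% CI computed on the log scale."""
    rr = (e1 / n1) / (e2 / n2)
    se = math.sqrt(1/e1 - 1/n1 + 1/e2 - 1/n2)        # SE of log(RR)
    lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
    return rr, lo, hi

# Assumed counts: 48/150 pregnancies with US alone vs. 50/150 with US + hormones.
rr, lo, hi = relative_risk(e1=48, n1=150, e2=50, n2=150)
print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```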
APA, Harvard, Vancouver, ISO, and other styles
42

Souza, Thiago Ferreira de. "Revisão sistemática da literatura sobre as terapias endoscópicas ablativas do esôfago de Barrett." Universidade de São Paulo, 2011. http://www.teses.usp.br/teses/disponiveis/5/5154/tde-07102011-114259/.

Full text
Abstract:
Barrett's esophagus is the main risk factor for adenocarcinoma and results from the chronic aggression produced by gastroesophageal reflux. Its therapeutic approach is controversial, and surgical treatment may be indicated in the presence of high-grade intraepithelial neoplasia. The endoscopic approach is an alternative with lower morbidity and mortality rates and favorable results. Methods: A systematic review of scientific databases was conducted and randomized, controlled studies were selected, enabling meta-analysis and the isolated evaluation of the results of ablative therapies for intestinal metaplasia. Cryotherapy, laser therapy, photodynamic therapy, multipolar electrocoagulation, and ablation by argon plasma coagulation and radiofrequency were considered ablative therapies. Results: The systematic review through PubMed retrieved the studies with the highest strength of evidence and grade of recommendation available regarding the ablative treatment of Barrett's esophagus. No other database searched yielded additional articles. The selected articles are randomized, controlled studies classified as A or B according to the Oxford table. In the meta-analysis, photodynamic therapy presented an increased risk of treatment failure compared to argon plasma coagulation, NNH = -7. The side effects considered were chest pain and stenosis, with a higher risk of chest pain when ablation is performed by photodynamic therapy and similar numbers of stenoses for both therapies. Ablation of Barrett's esophagus by multipolar electrocoagulation or argon plasma showed similar risks of treatment failure, as well as of the side effect of chest pain, in the meta-analysis. Photodynamic therapy associated with a proton pump inhibitor showed benefit, in the meta-analysis, for the ablation of Barrett's esophagus compared to the isolated use of the inhibitor, NNT = 2. Chest pain, stenosis and photosensitivity are associated with the endoscopic intervention, and photodynamic therapy is accordingly associated with increased risk, with NNH between -2 and -3. In an isolated study, there was no increase in the risk of treatment failure for ablation of Barrett's esophagus by argon plasma plus proton pump inhibitor versus the isolated use of the proton pump inhibitor. Radiofrequency associated with a proton pump inhibitor is an effective method for reducing the risk of treatment failure, NNT = 1. Conclusions: There are no studies demonstrating a benefit of indicating cryotherapy or laser therapy for the endoscopic approach to Barrett's esophagus. Ablation by argon plasma coagulation shows superior efficacy compared to photodynamic therapy, and ablation by argon plasma coagulation and multipolar electrocoagulation show effective and similar results. Photodynamic therapy shows fewer treatment failures than the isolated use of a proton pump inhibitor. There are insufficient data to demonstrate the efficacy of argon plasma ablation compared to the isolated use of a proton pump inhibitor. Radiofrequency is the most recent approach and requires comparative studies before it can be indicated. Ablative endoscopic treatments are associated with a higher risk of side effects such as chest pain, stenosis and photosensitivity compared to clinical treatment alone, although such events are neither severe nor limiting.
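For readers unfamiliar with the NNT/NNH figures quoted above, the sketch below shows the underlying arithmetic: the number needed to treat is the reciprocal of the absolute risk difference. The failure rates used are invented to illustrate the calculation, not taken from the review.

```python
def nnt(risk_control: float, risk_treatment: float) -> float:
    """Number needed to treat; negative values read as number needed to harm."""
    arr = risk_control - risk_treatment     # absolute risk reduction
    return 1.0 / arr

# e.g. ablation failure: 60% with PPI alone vs. 10% with photodynamic therapy + PPI
print(f"NNT = {nnt(0.60, 0.10):.0f}")       # 2 -> treat 2 patients to avoid 1 failure
```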
APA, Harvard, Vancouver, ISO, and other styles
43

Frazzato, Viana Renato. "Técnicas de classificação aplicadas a credit scoring : revisão sistemática e comparação." Universidade Federal de São Carlos, 2015. https://repositorio.ufscar.br/handle/ufscar/7294.

Full text
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
With the growing demand for credit, it is very important to evaluate the risk of each operation of this kind. When granting credit to a customer, it is therefore necessary to assess the chances that the customer will not repay the loan, and credit scoring techniques are applied to this task. This work presents a literature review of credit scoring, aiming to give an overview of the many techniques employed, and a computational simulation study is carried out in order to compare the behaviour of several of the techniques presented in the study.
APA, Harvard, Vancouver, ISO, and other styles
44

Manzoor, Numan, and Umar Shahzad. "Information Visualization for Agile Development in Large‐Scale Organizations." Thesis, Blekinge Tekniska Högskola, Sektionen för datavetenskap och kommunikation, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-3094.

Full text
Abstract:
Context: Agile/lean development has been successful in situations where small teams collaborate over long periods of time with project stakeholders. What is unclear is how such teams plan and coordinate their work in situations where inter-dependencies with other projects exist. In large organizations, scattered teams and complex team structures make it difficult for every stakeholder to have a clear understanding of project information. These factors make it difficult for large-scale organizations to adopt the agile/lean development paradigm. Objectives: The goal of this study is to find the information visualization techniques that ease or resolve the challenges of agile development in large-scale organizations. The study reports the challenges of agile development and the information visualization techniques found in the literature and reported by industrial experts. Additionally, it proposes a guideline for how information visualization techniques can be used to ease or resolve related challenges of agile development. Methods: Two research methodologies were used: Systematic Literature Review (SLR) and industrial survey. Two SLRs were performed to find 1) challenges of agile development and 2) information visualization techniques in agile development. Data sources such as Engineering Village (Inspec/Compendex), the IEEE Xplore digital library, the ACM Digital Library, Science Direct, ISI Web of Knowledge and Scopus were used to select primary studies. An industrial survey was conducted in order to obtain empirical evidence for our findings. The survey questions mainly concerned the challenges of agile development and the information visualization techniques practised by industrial experts. Results: 84 different challenges of agile development were found in the literature and, by applying grounded theory, grouped into 9 distinct categories. 55 challenges were reported by industrial experts in the survey and later grouped into 10 distinct challenges. 45 information visualization techniques were found in the literature and grouped into 21 distinct techniques. 47 different information visualization techniques were reported by industrial experts; applying the open, axial and selective coding of grounded theory grouped these into 9 distinct techniques. Conclusions: The Systematic Literature Review and the industrial survey confirmed that information visualization techniques can be used to ease or resolve challenges of agile development. Along with other visualization techniques, data flow diagrams, UML, use case diagrams, burn-down charts, Scrum story boards, Kanban boards and Gantt charts were the most frequently reported techniques found through the systematic literature review and later confirmed by industrial experts. On the other hand, the survey showed that industrial experts mainly rely on informal and customized information visualization techniques to visualize information.
APA, Harvard, Vancouver, ISO, and other styles
45

Ni, Suteng. "Review : integration of EMI technique with global vibration technique." Thesis, Massachusetts Institute of Technology, 2013. http://hdl.handle.net/1721.1/82821.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Civil and Environmental Engineering, 2013.
Cataloged from PDF version of thesis.
Includes bibliographical references (p. 48-50).
In the last decade, the development of Structural Health Monitoring (SHM) has accelerated rapidly because of the serious consequences of structural failure. Traditional damage detection techniques, also known as local damage detection techniques, such as visual inspection and ultrasonic testing, have been implemented since the mid 20th century. However, these techniques often require prior knowledge of potential damage locations and bulky testing equipment. Alternative techniques, the global vibration techniques, were introduced to analyze the modal information of the structure and assess its overall health state; their drawback is their insensitivity to incipient local damage. With the development of sensor technology, a local damage detection technique, the Electro-Mechanical Impedance (EMI) method, has emerged. EMI measures the electrical admittance with an impedance analyzer and evaluates the health of the structure by comparing the baseline signature with the damaged signature. It allows users to assess the structure remotely, but it loses sensitivity when the damage is significant. Therefore, Bhalla, Shanker and Gupta proposed integrating the global vibration techniques with the EMI technique so as to tap the strengths of the respective techniques. This new method, the integration of the global vibration technique and the EMI technique, draws on EMI's high sensitivity to early incipient damage and the global vibration techniques' sensitivity at late damage stages. The author further examines the integrated method in terms of practicality and scalability. In view of some sensor-related issues, the author does not suggest applying the method to real structures.
by Suteng Ni.
M.Eng.
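A brief sketch of how EMI-based monitoring is commonly quantified: a root-mean-square-deviation (RMSD) damage index between the baseline admittance signature and a later measurement. RMSD is a widely used metric in the EMI literature, not necessarily the exact statistic this thesis adopts, and the signatures below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
freq = np.linspace(30e3, 400e3, 500)                 # typical EMI sweep band (assumed)
baseline = 1e-3 * (1 + 0.1 * np.sin(freq / 2e4)) + 1e-5 * rng.normal(size=freq.size)
damaged = baseline + 4e-5 * np.sin(freq / 5e3)       # shifted peaks mimic local damage

# RMSD damage index: relative change of the signature w.r.t. the baseline, in %.
rmsd = 100 * np.sqrt(np.sum((damaged - baseline)**2) / np.sum(baseline**2))
print(f"RMSD damage index = {rmsd:.1f}%")            # larger values -> more change
```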
APA, Harvard, Vancouver, ISO, and other styles
46

Zanardi, Volney. "A critical review of techniques used for the comparison of power generation systems on grounds of safety and environmental impacts and risks : incorporating case studies of coal and hydropower generation systems in southern Brazil." Thesis, University of East Anglia, 2001. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.249452.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Klafke, Guilherme Forma. "A interpretação conforme a constituição na doutrina brasileira: uma análise das relações entre conceitos e os limites à utilização da técnica." Universidade de São Paulo, 2015. http://www.teses.usp.br/teses/disponiveis/2/2134/tde-08042016-150951/.

Full text
Abstract:
This research examines the relationship between the concept of interpretation in harmony with the Constitution and the limits on its use, according to Brazilian legal doctrine. The analysis aims to verify whether and how the different concepts of interpretation in harmony with the Constitution employed by authors influence their arguments about the limits on the use of this instrument. The starting point is a theoretical framework of conceptual analysis, based mainly on the work of Brian Bix and Andrew Halpin, used to structure the identification and analysis of the different concepts present in the literature. The ideas of the most influential authors in the national legal doctrine are examined, selected by the number of citations and the specificity of their work. The research concludes that: (a) the authors adopt at least nine different concepts: (i) constitutionally guided interpretation, (ii) constitutionally based integration, a normative standard (iii) with and (iv) without a declaration of unconstitutionality, (v) extensive or restrictive statutory interpretation based on the Constitution, (vi) the declaration of unconstitutionality of a particular interpretation, (vii) a decision with a specific type of structure, (viii) a particular type of decision-making device, and (ix) a rhetorical argument; (b) the authors often mention the same limits, regardless of the concept they adopt; (c) the limits nevertheless do not apply in the same way to all the concepts employed. It is argued that clarifying these relationships can increase understanding of the topic, including for the purpose of analyzing the jurisprudence.
APA, Harvard, Vancouver, ISO, and other styles
48

Chanamolu, Charitha. "REVIEWS TO RATING CONVERSION AND ANALYSIS USING MACHINE LEARNING TECHNIQUES." CSUSB ScholarWorks, 2019. https://scholarworks.lib.csusb.edu/etd/792.

Full text
Abstract:
With the advent of technology in recent years, people depend more on online reviews when purchasing a product. It is hard to determine whether a product is good or bad from hundreds of mixed reviews, and it is very time-consuming to read many of them, so opinion mining of reviews is necessary. The main aim of this project is to convert the reviews of a product into a rating and to evaluate the ratings using machine learning algorithms such as Naïve Bayes and Support Vector Machine. In the process of converting the reviews to a rating, word scores are computed using SentiWordNet and mapped into seven categories from highly positive to highly negative.
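The thesis does not spell out its exact scoring pipeline, but the word-level SentiWordNet step it describes can be sketched with NLTK's SentiWordNet corpus reader; the seven category labels and the binning rule below are invented for illustration:

    from nltk.corpus import sentiwordnet as swn
    # One-time setup: nltk.download('wordnet'); nltk.download('sentiwordnet')

    CATEGORIES = ["highly negative", "negative", "slightly negative", "neutral",
                  "slightly positive", "positive", "highly positive"]

    def word_score(word):
        # Average (positive - negative) score over the word's SentiWordNet synsets.
        synsets = list(swn.senti_synsets(word))
        if not synsets:
            return 0.0
        return sum(s.pos_score() - s.neg_score() for s in synsets) / len(synsets)

    def review_category(text):
        # Map a review to one of seven categories by its mean word score.
        scores = [word_score(w) for w in text.lower().split()]
        mean = sum(scores) / len(scores) if scores else 0.0
        # The mean lies roughly in [-1, 1]; split that range into seven equal bins.
        idx = min(6, max(0, int((mean + 1.0) / (2.0 / 7))))
        return CATEGORIES[idx]

    print(review_category("this product is wonderful and works great"))

The resulting categorical labels can then serve as training targets for the Naïve Bayes and SVM classifiers the abstract mentions.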
APA, Harvard, Vancouver, ISO, and other styles
49

Zou, Hai Tao. "Local topology of social networks in supporting recommendations and diversity identification of reviews." Thesis, University of Macau, 2015. http://umaclib3.umac.mo/record=b3335434.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Ma, Qiao. "The effectiveness of requirements prioritization techniques for a medium to large number of requirements a systematic literature review : a dissertation submitted to Auckland University of Technology in partial fulfilment of the requirements for the degree of Master of Computer and Information Sciences (MCIS), 2009 /." Click here to access this resource online, 2009. http://hdl.handle.net/10292/833.

Full text
APA, Harvard, Vancouver, ISO, and other styles