To see the other types of publications on this topic, follow the link: Standard deviation.

Dissertations / Theses on the topic 'Standard deviation'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 dissertations / theses for your research on the topic 'Standard deviation.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse dissertations / theses on a wide variety of disciplines and organise your bibliography correctly.

1

Lima, Mariana Braga de. "Can deviation from standard beauty become appealing?: an age perspective." reponame:Repositório Institucional do FGV, 2015. http://hdl.handle.net/10438/14589.

Full text
Abstract:
Submitted by Mariana Lima (mariana.lima@fgvmail.br) on 2015-11-26T21:02:54Z No. of bitstreams: 1 Dissertação Mariana Braga de Lima Mex2014.pdf: 14412514 bytes, checksum: c1117481eece896b474e8ef644ae0f52 (MD5)
Previous issue date: 2015-11-10
When exploring new perspectives on the impact of non-idealized vs. idealized body image in advertising, studies have focused mainly on body size, i.e., thin vs. heavy (Antioco et al., 2012; Smeesters & Mandel, 2006). Age remains largely unexplored, and the vast majority of ads in the market depict young models. The purpose of this research is therefore to investigate which images in advertisements – young or mature models – are more persuasive for older women (40+ years old). In this investigation, two studies were conducted. The first part was an exploratory analysis with a qualitative approach, which in turn helped to formulate the hypothesis tested in the subsequent experiment. The results of the in-depth interviews suggested a conflict over notions of imprisonment (need to follow beauty standards) and freedom (wish to deviate). The results of the experiment showed essentially that among older consumers, ads portraying older models were as persuasive as ads portraying younger models. Limitations and future research are discussed.
APA, Harvard, Vancouver, ISO, and other styles
2

Bergstrom, Tom, and Patrik Carlsson. "Diversification Attributes of Dutch REITs During Recessions:Return, Standard Deviation and Liquidity Characteristics." Thesis, KTH, Fastigheter och byggande, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-277894.

Full text
Abstract:
The objective of this thesis is to determine the performance and liquidity of Dutch REITs during recessions and economic upswings, as well as their correlation with other asset classes, to gain further knowledge in the field of real estate investment and asset performance during certain time periods. This is achieved through a quantitative analysis of historical daily returns, standard deviation and transaction volume of shares for REITs and other assets that usually pertain to an investor's portfolio. The analysis covers the time period just prior to the global financial crisis up until the beginning of the financial crisis caused by Covid-19. The analyzed data display some correlation between REITs and other assets. However, the data still imply some diversification benefits of incorporating REITs in a portfolio during all economic states throughout the time periods, both with the objective to minimize risk, i.e. standard deviation, and to maximize return. The liquidity results on offset-time efficiency are comparable with other stocks, which suggests that REITs are as liquid as other stocks and thus more liquid than direct real estate investments. In conclusion, the data do support some diversification benefits of including Dutch REITs in a Netherlands-based investment portfolio. However, to what extent cannot be determined, in part because of individual investors' preferences, beliefs, and behavior, but also because of additional factors, such as dividends, that affect the value of REITs to an actual investor.
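As a rough, hypothetical illustration of the kind of analysis described above (the return series below are invented, not the thesis data), the standard deviation of daily returns and the cross-asset correlation that drive such diversification conclusions can be computed as:

```python
import statistics

def std_dev(returns):
    # Sample standard deviation of a return series (the "risk" measure above).
    return statistics.stdev(returns)

def correlation(xs, ys):
    # Pearson correlation between two return series; values well below 1
    # suggest diversification benefits from holding both assets.
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

# Hypothetical daily returns for a REIT and a stock index.
reit = [0.01, -0.02, 0.015, 0.0, -0.005]
index = [0.012, -0.018, 0.01, 0.002, -0.004]
print(std_dev(reit), correlation(reit, index))
```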
APA, Harvard, Vancouver, ISO, and other styles
3

Rickardsson, Elin. "Calculated deviation : A case study at Arkivator." Thesis, University of Skövde, School of Technology and Society, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:his:diva-3176.

Full text
Abstract:



A calculation discrepancy may arise when the preliminary calculation and the post-costing of an order are compared. This thesis explores the reasons behind such calculated deviations. In connection with the thesis, a case study was conducted at Arkivator in Falköping, where a deviation analysis was carried out on two of their manufacturing orders. The data collection method used in the study was personal interviews with the company's treasurer and production controller. The conclusion from the study was that there are several different reasons why a calculation difference arises. The main reasons for the differences at Arkivator were that one product had to be discarded during production and that a production team probably forgot to report when they completed the order. In order to establish better preliminary and post-calculations in the future, it is important to follow up on the calculation differences that have arisen and learn from them.

APA, Harvard, Vancouver, ISO, and other styles
4

Jones, G. J. "A study of the surface finish produced by grinding." Thesis, Brunel University, 1985. http://bura.brunel.ac.uk/handle/2438/4893.

Full text
Abstract:
A survey of the literature of grinding and surface texture shows the influence of dressing and wear on surfaces involved in the process and the advantages of stylus profilometry for data collection from both grinding wheels and ground surfaces. Statistical analysis is favoured for surface profile characterization and, of the various parameters used, power spectral density alone offers some prospect of effective comparison between these surfaces. Work on grinding with single crystals of natural corundum was eventually discontinued in favour of experiments with conventional bonded grinding wheels subjected to a dressing operation and some wear in grinding steel surfaces. Statistical parameters representing the surfaces are computed using data obtained from profilograms. Results in terms of power spectral density are presented showing progressive improvement following upon developments in apparatus and methods which facilitated the use of larger surface profile samples. Transfer functions are used to relate power spectra representing corresponding pairs of surfaces. The significance of power spectral density applied to surface profile characterization is discussed and, in this context, it is suggested that these should be described as variance spectra. Attention is drawn to certain disadvantages of variance spectra applied to grinding wheel and ground surface profiles. Methods designed to improve presentation of variance spectra lead to development of a proposed new and more suitable spectrum in which density of standard deviation of surface profile ordinates with respect to frequency is plotted against frequency. Transfer functions calculated from related pairs of these standard deviation spectra show a strong linear correlation with frequency and offer prospects of convenient comparison between the profiles of the various surfaces involved in grinding.
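As an illustrative sketch (not the author's apparatus or data), a variance spectrum of the kind discussed above can be computed from a sampled surface profile with a discrete Fourier transform; taking the square root of each density would give the proposed standard-deviation spectrum:

```python
import cmath
import math

def variance_spectrum(profile):
    # Discrete Fourier transform of a mean-centred surface-profile trace; the
    # squared magnitude per frequency bin is the variance ("power") spectral
    # density discussed above.
    n = len(profile)
    mean = sum(profile) / n
    centred = [y - mean for y in profile]
    spectrum = []
    for k in range(n // 2 + 1):
        c = sum(y * cmath.exp(-2j * cmath.pi * k * i / n)
                for i, y in enumerate(centred))
        spectrum.append(abs(c) ** 2 / n)
    return spectrum

# Toy profile: a single sinusoid, so the variance concentrates at one frequency.
profile = [math.sin(2 * math.pi * 3 * i / 32) for i in range(32)]
spec = variance_spectrum(profile)
print(max(range(len(spec)), key=spec.__getitem__))  # 3
```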
APA, Harvard, Vancouver, ISO, and other styles
5

Ford, Anna. "Efficacy of change in body mass index standard deviation score for improving the cardiometabolic status of children and adolescents with obesity." Thesis, University of Bristol, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.500398.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Bencová, Monika. "Využití controllingu v podniku." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2020. http://www.nusl.cz/ntk/nusl-417395.

Full text
Abstract:
The purpose of the thesis is to describe controlling and its function in a real company. Focus is specifically on the cost of imbalances and the analysis of their origin. The theoretical part serves as a basis for understanding the real processes in a company, followed by their evaluation and proposals for improvements in the scope of cost management.
APA, Harvard, Vancouver, ISO, and other styles
7

Schneider, Harald Jörn, Bernhard Saller, Jens Klotsche, Winfried März, Wolfgang Erwa, Hans-Ulrich Wittchen, and Günter Karl Stalla. "Opposite associations of age-dependent insulin-like growth factor-I standard deviation scores with nutritional state in normal weight and obese subjects." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2013. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-100946.

Full text
Abstract:
Objective: Insulin-like growth factor-I (IGF-I) has been suggested to be a prognostic marker for the development of cancer and, more recently, cardiovascular disease. These diseases are closely linked to obesity, but reports of the association of IGF-I with measures of obesity are divergent. In this study, we assessed the association of age-dependent IGF-I standard deviation scores with body mass index (BMI) and intra-abdominal fat accumulation in a large population. Design: A cross-sectional, epidemiological study. Methods: IGF-I levels were measured with an automated chemiluminescence assay system in 6282 patients from the DETECT study. Weight, height, and waist and hip circumference were measured according to written instructions. Standard deviation scores (SDS), correcting IGF-I levels for age, were calculated and used for further analyses. Results: An inverse U-shaped association of IGF-I SDS with BMI, waist circumference, and the ratio of waist circumference to height was found. BMI was positively associated with IGF-I SDS in normal weight subjects, and negatively associated in obese subjects. The highest mean IGF-I SDS were seen at a BMI of 22.5–25 kg/m2 in men (+0.08), and at a BMI of 27.5–30 kg/m2 in women (+0.21). Multiple linear regression models, controlling for different diseases, medications and risk conditions, revealed a significant negative association of BMI with IGF-I SDS. BMI contributed the most additional explained variance beyond the other health conditions. Conclusions: IGF-I standard deviation scores are decreased in obese and underweight subjects. These interactions should be taken into account when analyzing the association of IGF-I with diseases and risk conditions.
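The age-dependent standard deviation score used above is a z-score against age-specific reference values; a minimal sketch with invented reference numbers (not the study's assay data):

```python
def sds(value, age_mean, age_sd):
    # Standard deviation score (z-score): how many reference standard
    # deviations the measured value lies above or below the age-specific mean.
    return (value - age_mean) / age_sd

# Hypothetical IGF-I reference values for one age band (illustrative only).
print(sds(180.0, 150.0, 60.0))  # 0.5
```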
APA, Harvard, Vancouver, ISO, and other styles
8

Manseck, Andreas, Christian Pilarsky, Stefan E. Froschermaier, Mario Menschikowski, and Manfred P. Wirth. "Diagnostic Significance of Prostate-Specific Antigen Velocity at Intermediate PSA Serum Levels in Relation to the Standard Deviation of Different Test Systems." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-133947.

Full text
Abstract:
The use of serial prostate-specific antigen (PSA) measurements (PSA velocity) as an additional instrument to detect prostatic cancer was introduced in 1992. It has previously been reported that the PSA increase per year in the last 5 years prior to diagnosis differed between patients with benign prostatic hyperplasia (0.18 ng/ml/year), locally confined (0.75 ng/ml/year) and metastasized (4.4 ng/ml/year) cancer of the prostate (CaP), in contrast to healthy men (0.04 ng/ml/year). The ability of PSA velocity to detect organ-confined CaP in patients with intermediate PSA serum values therefore depends on a reliable and reproducible PSA result. The present study comprised 85 men with PSA values between 3 and 8 ng/ml (Abbott IMx). PSA measurements were repeated with Abbott IMx (n = 85 patients) and Hybritech Tandem-E (n = 59 patients) assays. The PSA serum values differed from one examination to the other by 0.02 to 2.74 ng/ml with the Abbott IMx. The standard deviation amounted to 0.35 ng/ml with the Abbott IMx PSA assay. Using the Hybritech Tandem-E assay, the mean standard deviation was 1.15 ng/ml and therefore higher than with the Abbott IMx assay. The difference from one test to the other ranged from 0.05 to 4.05 ng/ml with the Hybritech Tandem-E. Using the Abbott IMx assay, 10.6% of all repeat measurements exceeded 1 ng/ml, whereas in the Hybritech Tandem-E assay 62.7% of the second measurements differed by >1 ng/ml from the first PSA result. An increase in PSA serum values may therefore be due to intra-test variation and physiological day-to-day variation as well as prostatic disease. It is important to note that the intra-assay variation may be greater than the PSA increase per year in a patient with CaP. Therefore, PSA velocity seems to be of limited value.
This article is freely accessible with the consent of the rights holder under a (DFG-funded) Alliance or National Licence.
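The abstract's caveat can be sketched as a simple decision rule (the threshold rule and the PSA values below are illustrative assumptions; only the 0.35 ng/ml assay standard deviation comes from the study): a PSA rise is only meaningful if it clearly exceeds the assay's repeat-measurement variability.

```python
def rise_exceeds_assay_noise(psa_first, psa_second, assay_sd, k=2.0):
    # Flags a PSA increase only when it exceeds k standard deviations of the
    # assay's repeat-measurement variability; smaller rises may reflect
    # intra-assay noise rather than disease progression.
    return (psa_second - psa_first) > k * assay_sd

# With the Abbott IMx SD of 0.35 ng/ml reported above, a rise of 0.5 ng/ml
# does not clear a 2-SD threshold (0.70 ng/ml), while a rise of 1.0 ng/ml does.
print(rise_exceeds_assay_noise(4.0, 4.5, 0.35))  # False
print(rise_exceeds_assay_noise(4.0, 5.0, 0.35))  # True
```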
APA, Harvard, Vancouver, ISO, and other styles
9

Schneider, Harald Jörn, Bernhard Saller, Jens Klotsche, Winfried März, Wolfgang Erwa, Hans-Ulrich Wittchen, and Günter Karl Stalla. "Opposite associations of age-dependent insulin-like growth factor-I standard deviation scores with nutritional state in normal weight and obese subjects." BioScientifica, 2006. https://tud.qucosa.de/id/qucosa%3A26325.

Full text
Abstract:
Objective: Insulin-like growth factor-I (IGF-I) has been suggested to be a prognostic marker for the development of cancer and, more recently, cardiovascular disease. These diseases are closely linked to obesity, but reports of the association of IGF-I with measures of obesity are divergent. In this study, we assessed the association of age-dependent IGF-I standard deviation scores with body mass index (BMI) and intra-abdominal fat accumulation in a large population. Design: A cross-sectional, epidemiological study. Methods: IGF-I levels were measured with an automated chemiluminescence assay system in 6282 patients from the DETECT study. Weight, height, and waist and hip circumference were measured according to written instructions. Standard deviation scores (SDS), correcting IGF-I levels for age, were calculated and used for further analyses. Results: An inverse U-shaped association of IGF-I SDS with BMI, waist circumference, and the ratio of waist circumference to height was found. BMI was positively associated with IGF-I SDS in normal weight subjects, and negatively associated in obese subjects. The highest mean IGF-I SDS were seen at a BMI of 22.5–25 kg/m2 in men (+0.08), and at a BMI of 27.5–30 kg/m2 in women (+0.21). Multiple linear regression models, controlling for different diseases, medications and risk conditions, revealed a significant negative association of BMI with IGF-I SDS. BMI contributed the most additional explained variance beyond the other health conditions. Conclusions: IGF-I standard deviation scores are decreased in obese and underweight subjects. These interactions should be taken into account when analyzing the association of IGF-I with diseases and risk conditions.
APA, Harvard, Vancouver, ISO, and other styles
10

Manseck, Andreas, Christian Pilarsky, Stefan E. Froschermaier, Mario Menschikowski, and Manfred P. Wirth. "Diagnostic Significance of Prostate-Specific Antigen Velocity at Intermediate PSA Serum Levels in Relation to the Standard Deviation of Different Test Systems." Karger, 1998. https://tud.qucosa.de/id/qucosa%3A27551.

Full text
Abstract:
The use of serial prostate-specific antigen (PSA) measurements (PSA velocity) as an additional instrument to detect prostatic cancer was introduced in 1992. It has previously been reported that the PSA increase per year in the last 5 years prior to diagnosis differed between patients with benign prostatic hyperplasia (0.18 ng/ml/year), locally confined (0.75 ng/ml/year) and metastasized (4.4 ng/ml/year) cancer of the prostate (CaP), in contrast to healthy men (0.04 ng/ml/year). The ability of PSA velocity to detect organ-confined CaP in patients with intermediate PSA serum values therefore depends on a reliable and reproducible PSA result. The present study comprised 85 men with PSA values between 3 and 8 ng/ml (Abbott IMx). PSA measurements were repeated with Abbott IMx (n = 85 patients) and Hybritech Tandem-E (n = 59 patients) assays. The PSA serum values differed from one examination to the other by 0.02 to 2.74 ng/ml with the Abbott IMx. The standard deviation amounted to 0.35 ng/ml with the Abbott IMx PSA assay. Using the Hybritech Tandem-E assay, the mean standard deviation was 1.15 ng/ml and therefore higher than with the Abbott IMx assay. The difference from one test to the other ranged from 0.05 to 4.05 ng/ml with the Hybritech Tandem-E. Using the Abbott IMx assay, 10.6% of all repeat measurements exceeded 1 ng/ml, whereas in the Hybritech Tandem-E assay 62.7% of the second measurements differed by >1 ng/ml from the first PSA result. An increase in PSA serum values may therefore be due to intra-test variation and physiological day-to-day variation as well as prostatic disease. It is important to note that the intra-assay variation may be greater than the PSA increase per year in a patient with CaP. Therefore, PSA velocity seems to be of limited value.
This article is freely accessible with the consent of the rights holder under a (DFG-funded) Alliance or National Licence.
APA, Harvard, Vancouver, ISO, and other styles
11

Eliasson, Elin, and Charlotta Karlsson. "Nöjda kunder med risken i fokus : En studie i hur finansiell risk bör förmedlas." Thesis, Linköping University, Department of Management and Economics, 2004. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-2408.

Full text
Abstract:

During the last decades, major changes have occurred in the financial markets, including an increasing supply and a greater variation of financial instruments. The saving habits of the Swedish people have gone from traditional bank deposits to investments in equities, funds and bonds. All this, together with the great rise in the stock market in the late 90's, has made words like risk and return topical, and is the background to the development of a new law concerning financial advising which came into force on the 1st of July 2004.

The contents of the thesis can be described as three bricks, representing the survey questions. The thesis starts with descriptions of which risk and return concepts exist and which are used by contemporary financial institutions. Further on, the thesis deals with individuals' perception of risk, in particular financial risk. Finally, details are given regarding how a message should be conveyed. The three bricks together fulfil the purpose of the thesis: to investigate how the meaning of financial risk can be explained in a simple and pedagogical way to a person not familiar with financial literature, and to develop questions that facilitate ascertaining an individual's risk profile.

We have found that standard deviation is the risk concept that dominates in financial theory and, together with Value at Risk, is the most commonly used in practice. Good knowledge about risk is required when explaining risk. It is important to present the information in an attractive way and to use examples and illustrations. For financial advisers it is also important to have knowledge of human behaviour, because ascertaining the client's risk profile is an important part of the risk explanation. A client's risk profile is best ascertained with so-called open questions, where both what the client answers and how he or she answers can form the basis for the judgement.
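As a minimal, hypothetical illustration of the two dominant risk concepts named above, the standard deviation of a return series and a simple parametric Value at Risk derived from it can be computed as follows (a normal distribution is assumed and the figures are invented):

```python
import statistics

def parametric_var(returns, z=1.65):
    # One-sided parametric Value at Risk (in return terms) under a normality
    # assumption: mean return minus z standard deviations (z ≈ 1.65 for 95%),
    # expressed as a positive loss figure.
    mu = statistics.fmean(returns)
    sigma = statistics.stdev(returns)
    return -(mu - z * sigma)

returns = [0.02, -0.01, 0.015, -0.02, 0.005]  # hypothetical daily returns
print(round(parametric_var(returns), 4))
```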

APA, Harvard, Vancouver, ISO, and other styles
12

Davidson, Fiona. "Predicting Glass Sponge (Porifera, Hexactinellida) Distributions in the North Pacific Ocean and Spatially Quantifying Model Uncertainty." Thesis, Université d'Ottawa / University of Ottawa, 2020. http://hdl.handle.net/10393/40028.

Full text
Abstract:
Predictions of species' ranges from distribution modeling are often used to inform marine management and conservation efforts, but few studies justify the model selected or quantify the uncertainty of the model predictions in a spatial manner. This thesis employs a multi-model, multi-area SDM analysis to develop a higher certainty in the predictions where similarities exist across models and areas. Partial dependence plots and variable importance rankings were shown to be useful in producing further certainty in the results. The modeling indicated that glass sponges (Hexactinellida) are most likely to exist within the North Pacific Ocean where alkalinity is greater than 2.2 μmol l⁻¹ and dissolved oxygen is lower than 2 ml l⁻¹. Silicate was also found to be an important environmental predictor. All areas, except Hecate Strait, indicated that high glass sponge probability of presence coincided with silicate values of 150 μmol l⁻¹ and over, although lower values in Hecate Strait confirmed that sponges can exist in areas with silicate values as low as 40 μmol l⁻¹. Three methods of showing spatial uncertainty of model predictions were presented: the standard error (SE) of a binomial GLM, the standard deviation of predictions made from 200 bootstrapped GLM models, and the standard deviation of eight commonly used SDM algorithms. Certain areas with few input data points or extreme ranges of predictor variables were highlighted by these methods as having high uncertainty. Such areas should be treated cautiously regardless of the overall accuracy of the model as indicated by accuracy metrics (AUC, TSS), and such areas could be targeted for future data collection. The uncertainty metrics produced by the multi-model SE varied from the GLM SE and the bootstrapped GLM. The uncertainty was lowest where the models predicted low probability of presence and highest where they predicted high probability of presence, and these predictions differed only slightly, indicating high confidence in where the models predicted the sponges would not exist.
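The bootstrap uncertainty idea described above can be sketched minimally; purely for illustration, the 200 refitted GLMs are replaced here by the simplest possible "model" (a resampled prevalence), with the standard deviation of the 200 predictions serving as the uncertainty measure:

```python
import random
import statistics

def bootstrap_prediction_sd(presence, n_boot=200, seed=42):
    # Resample the presence/absence observations with replacement, refit the
    # trivial "model" (mean probability of presence) each time, and report the
    # standard deviation of the n_boot predictions as spatial-free uncertainty,
    # mirroring the bootstrapped-GLM approach described above.
    rng = random.Random(seed)
    preds = []
    for _ in range(n_boot):
        sample = [rng.choice(presence) for _ in presence]
        preds.append(statistics.fmean(sample))
    return statistics.stdev(preds)

# Toy data: 1 = sponge present, 0 = absent (hypothetical, not the thesis data).
obs = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
print(round(bootstrap_prediction_sd(obs), 3))
```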
APA, Harvard, Vancouver, ISO, and other styles
13

Johansson, Rydell Marta, and Rosenblad Lisa Vendela. "Prissättningsmetoder vid börsintroduktioner : En studie om volatilitet och avkastning." Thesis, Linköpings universitet, Institutionen för ekonomisk och industriell utveckling, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-72858.

Full text
Abstract:
Background: In the past, most companies performing an Initial Public Offering, IPO, applied the fixed pricing method, which often lead to an extensive underpricing of the shares. By doing so, it was easy for investors to gain high return on the first trading day. Nowadays, companies use auction pricing to a greater extent where investors bid for a certain amount of shares to a certain price. This procedure has resulted in a decrease of the underpricing. With the assumption that some of the market’s expectations are included in the price, whilst using an auction pricing method, these stocks would possibly appear less volatile after the IPO. Purpose: The aim of this study is to investigate whether the volatility of the shares is different after the introduction on the market, based on which method that has been applied when pricing the shares. The thesis also investigates to what extent the choice of pricing method influences the underpricing and returns of a share after its introduction. Method: The study comprises quantitative historical data, such as share prices as well as additional information gathered from the prospectus of each IPO. In addition to arranging the data and the analyses, made in Excel, numerous econometric analyses have been made by using non-linear regressions, where variables such as pricing method, beta, underpricing on the first trading day, and variance have been examined as a dependent variable in relation to several different combinations of explanatory variables. Findings: The study finds that companies that have practiced a fixed pricing method show a higher volatility after the introduction on the market. Thus, the choice of either pricing method has some influence on the volatility. Furthermore, it was proved that companies using a fixed pricing method were more underpriced and gained higher returns during the first year of trading compared to companies using an auction pricing method.
APA, Harvard, Vancouver, ISO, and other styles
14

Hamrin, Erik. "A Heuristic Downside Risk Approach to Real Estate Portfolio Structuring : a Comparison Between Modern Portfolio Theory and Post Modern Portfolio Theory." Thesis, KTH, Bygg- och fastighetsekonomi, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-89812.

Full text
Abstract:
Portfolio diversification has been a subject frequently addressed since the publications of Markowitz in 1952 and 1959. However, the Modern Portfolio Theory and its mean-variance framework have been criticized. The critiques concern the assumption that return distributions are normally distributed and the symmetric definition of risk. This paper elaborates on these shortcomings and applies a heuristic downside risk approach to avoid the pitfalls inherent in the mean-variance framework. The result of the downside risk approach is compared and contrasted with the result of the mean-variance framework. The return data refer to the real estate sector in Sweden, and diversification is reached through property type and geographical location. The result reveals that diversification is reached differently by the two approaches. The downside risk measure applied here frequently diversifies successfully with the use of fewer proxies. The efficient portfolios derived also reveal that the downside risk approach would have contributed to a historically higher average total return. This paper outlines a framework for portfolio diversification; the result is empirical, and further research is needed in order to grasp the potential of downside risk measures.
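A minimal sketch of the contrast drawn above (the returns are hypothetical): downside deviation penalises only shortfalls below a target return, unlike the symmetric standard deviation.

```python
import statistics

def downside_deviation(returns, target=0.0):
    # Root-mean-square of shortfalls below the target return; gains above the
    # target are ignored, unlike in the symmetric standard deviation.
    shortfalls = [min(0.0, r - target) ** 2 for r in returns]
    return (sum(shortfalls) / len(returns)) ** 0.5

returns = [0.04, -0.02, 0.03, -0.01, 0.02]  # hypothetical annual returns
print(round(statistics.stdev(returns), 4), round(downside_deviation(returns), 4))
```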
APA, Harvard, Vancouver, ISO, and other styles
15

Wahabi, Abdoul Rassaki. "Resource management in IP networks." Thesis, Stellenbosch : Stellenbosch University, 2001. http://hdl.handle.net/10019.1/52436.

Full text
Abstract:
Thesis (MSc)--University of Stellenbosch, 2001.
ENGLISH ABSTRACT: IP networks offer scalability and flexibility for the rapid deployment of value-added IP services. However, with the increased demand and explosive growth of the Internet, carriers require a network infrastructure that is dependable, predictable, and offers consistent network performance. This thesis examines the functionality, performance and implementation aspects of the MPLS mechanisms to minimize the expected packet delay in MPLS networks. Optimal path selection and the assignment of bandwidth to those paths for minimizing the average packet delay are investigated. We present an efficient flow deviation algorithm (EFDA) which assigns a small amount of flow from a set of routes connecting each OD pair to the shortest path connecting that OD pair in the network. The flow is assigned in such a way that the network's average packet delay is minimized. Bellman's algorithm is used to find the shortest routes between all OD pairs. The thesis studies the problem of determining the routes between an OD pair and assigning capacities to those routes. The EFDA algorithm iteratively determines the global minimum of the objective function. We also use the optimal flows to compute the optimal link capacities in both single-rate and multirate networks. The algorithm has been applied to several examples and to different models of networks. The results are used to evaluate the performance of the EFDA algorithm and to compare the optimal solutions obtained with different starting topologies and different techniques. They all fall within a close cost-performance range, and all lie within the same range from the optimal solution as well.
AFRIKAANSE OPSOMMING: IP-netwerke voorsien die skaleerbaarheid en buigsaamheid vir die vinnige ontplooing van toegevoegde-waarde IP-dienste. Die vergrote aanvraag en eksplosiewe uitbreiding van die Internet benodig betroubare, voorspelbare en bestendige netwerkprestasie. Hierdie tesis ondersoek die funksionaliteit, prestasie en implementering van die MPLS(multiprotokoletiketskakel)- meganismes om die verwagte pakketvertraging te minimeer. Ons bespreek 'n doeltreffende algoritme vir vloei-afwyking (EFDA) wat 'n klein hoeveelheid vloei toewys uit die versameling van roetes wat elke OT(oorsprong-teiken)- paar verbind aan die kortste pad wat die OT-paar koppel. Die vloei word toegewys sodanig dat die netwerk se gemiddelde pakketvertraging geminimeer word. Bellman se algoritme word gebruik om die kortste roetes tussen alle OT-pare te bepaal. Die tesis bespreek die probleem van die bepaling van roetes tussen 'n OT-paar en die toewysing van kapasiteite aan sulke roetes. Die EFDA-algoritme bepaal die globale minimum iteratief. Ons gebruik ook optimale vloeie vir die berekening van die optimale skakelkapasiteite in beide enkel- en multikoers netwerke. Die algoritme is toegepas op verskeie voorbeelde en op verskillende netwerkmodelle. Die skakelkapasiteite word aangewend om die prestasie van die EFDAalgoritme te evalueer en dit te vergelyk met die optimale oplossings verkry met verskillende aanvangstopologieë en tegnieke. Die resultate val binne klein koste-prestasie perke wat ook na aan die optimale oplossing lê.
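The EFDA described above repeatedly shifts flow onto shortest routes found with Bellman's algorithm. A minimal sketch of that shortest-path step (the 4-node network and its per-link delays are invented for illustration):

```python
def bellman_ford(n, edges, source):
    """Bellman's algorithm: least-cost distances from `source` in a graph
    given as (u, v, cost) triples; here costs stand in for link delays."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0.0
    for _ in range(n - 1):              # relax every edge n-1 times
        for u, v, cost in edges:
            if dist[u] + cost < dist[v]:
                dist[v] = dist[u] + cost
    return dist

# Hypothetical 4-node network; each edge cost models a per-link packet delay.
edges = [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 4.0), (2, 3, 1.0), (1, 3, 5.0)]
print(bellman_ford(4, edges, 0))        # shortest delays from node 0
```

In the EFDA proper, a small fraction of each OD pair's flow would then be deviated onto these shortest routes and the link delays recomputed, iterating until the average packet delay stops decreasing.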
APA, Harvard, Vancouver, ISO, and other styles
16

Kotrč, Václav. "Napěťové reference v bipolárním a CMOS procesu." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2015. http://www.nusl.cz/ntk/nusl-221111.

Full text
Abstract:
This diploma thesis deals with the precision design of a Brokaw bandgap voltage reference, compared with MOS references. The proposed devices are separated and analysed step by step using Monte Carlo analysis. Methods are also presented for achieving a lower deviation of the output voltage in the yielded device, which needs no trimming.
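The Monte Carlo analysis mentioned above estimates the spread of the reference output voltage under process variation. A toy sketch of the idea (the bandgap model and all tolerances are invented for illustration, not the circuit from the thesis):

```python
import random
import statistics

def bandgap_vout(rng):
    """Toy bandgap model: V_out = V_be + k * V_T, with V_be and the
    resistor ratio k perturbed by invented process variation."""
    v_be = rng.gauss(0.65, 0.01)   # base-emitter voltage, V (assumed spread)
    k = rng.gauss(10.0, 0.15)      # resistor ratio setting the PTAT gain
    v_t = 0.02585                  # thermal voltage at ~300 K, V
    return v_be + k * v_t

rng = random.Random(1)
samples = [bandgap_vout(rng) for _ in range(10_000)]
print(statistics.mean(samples), statistics.stdev(samples))
```

The sample standard deviation is the figure a designer would try to reduce when aiming for a device that needs no trimming.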
APA, Harvard, Vancouver, ISO, and other styles
17

Kutišová, Kristýna. "Využití controllingu v podniku." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2017. http://www.nusl.cz/ntk/nusl-319460.

Full text
Abstract:
This Master's thesis focuses on dealing with cost variances (deviations) and the use of the standard costing method. The main intent is to analyse the current state and make an improvement proposal for the treatment of variances in a company producing server PCs. The thesis is divided into three parts: a theoretical part, an analytic part and a proposal part. The theoretical part covers the characteristics of controlling, costs, calculation methods, especially the standard costing method, and the analysis of variances. The next two parts analyse the current state and then suggest some improvement proposals.
APA, Harvard, Vancouver, ISO, and other styles
18

Monastyrskyi, Andrii. "Resonance ultrasonic vibrations and photoluminescence mapping for crack detection in crystalline silicon wafers and solar cells." [Tampa, Fla] : University of South Florida, 2008. http://purl.fcla.edu/usf/dc/et/SFE0002779.

Full text
APA, Harvard, Vancouver, ISO, and other styles
19

Zhang, Henry, and Alex Sahlman. "IT-bubblans inverkan på den amerikanska aktiemarknadens volatilitet." Thesis, Södertörns högskola, Institutionen för samhällsvetenskaper, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-22444.

Full text
Abstract:
Syfte: Syftet med denna studie var att se hur och varför volatiliteten påverkades i DJIA, S&P 500 och NASDAQ Composite under IT-bubblan. Metod: Års- och månadsvolatiliteten för DJIA, S&P 500 och NASDAQ Composite har beräknats under 1995-2004 med hjälp av data från Yahoo Finance. Empiri: Resultatet visar att volatiliteten var väsentligt högre i NASDAQ Composite än vad den var i S&P 500 och DJIA som i sin tur höll en liknande volatilitet i förhållande till varandra. Analys: I analysen framträdde det att volatiliteten blev väsentligt högre i samband med att bubblan sprack under maj 2000 fram till dess att paniken lade sig kort efter maj 2002. Det fanns en hög överensstämmelse mellan denna rapport och övriga tidigare studier. Teorierna var mestadels väl applicerbara. Slutsats: Volatiliteten för DJIA, S&P 500 och NASDAQ Composite var som högst mellan 2000 och 2002 under undersökningsperioden 1995-2004. IT-bubblan uppstod samt sprack till följd av irrationellt investeringsbeteende bland investerarna på aktiemarknaden och paniken som uppstod efteråt gjorde att volatiliteten på aktiemarknaden höll sig förhållandevis hög fram tills den lade sig kort efter maj 2002. NASDAQ Composite hade högst volatilitet till följd av IT-bubblan medan DJIA och S&P 500 hade likvärdig volatilitet. Samtliga index följde ett liknande mönster, detta var troligtvis på grund av att företag från NASDAQ Composite kunde återfinnas i S&P 500 samt DJIA.
Purpose: The purpose of this thesis is to see how and why the volatility of the DJIA, S&P 500 and NASDAQ Composite was affected during the Dot-com bubble. Method: The yearly and monthly volatility of the DJIA, S&P 500 and NASDAQ Composite were computed with data spanning 1995-2004, collected from Yahoo Finance. Empirics: The results illustrate that volatility was vastly higher in the NASDAQ Composite than in the DJIA and S&P 500, which in turn exhibited comparable volatility in relation to each other. Analysis: The analysis showed that volatility rose considerably after the bubble burst in May 2000 and started waning after the panic died out circa May 2002. There was relatively high agreement between the results of this report and the earlier studies to which it was compared. Conclusion: The volatility of the DJIA, S&P 500 and NASDAQ Composite was higher between 2000 and 2002 than during the rest of the observed period 1995-2004. The Dot-com bubble arose due to irrational investment behavior among investors, and the panic which arose afterwards contributed to the increased volatility, which remained high until it subsided after May 2002. The NASDAQ Composite had the highest volatility during the Dot-com bubble, while the DJIA and S&P 500 had similar volatility. All indexes followed a similar pattern, probably because companies from the NASDAQ Composite reasonably should be found in the S&P 500 and DJIA.
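The yearly and monthly volatilities in studies like this one are standard deviations of index returns. A minimal sketch of the computation (the month-end prices are invented, not the Yahoo Finance data used in the thesis):

```python
import math
import statistics

def monthly_volatility(prices):
    """Volatility as the sample standard deviation of log returns."""
    rets = [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]
    return statistics.stdev(rets)

def annualize(monthly_vol):
    """Scale monthly volatility to a yearly figure (i.i.d.-returns assumption)."""
    return monthly_vol * math.sqrt(12)

prices = [100, 104, 99, 108, 103, 110, 101]   # invented month-end index closes
vol_m = monthly_volatility(prices)
print(vol_m, annualize(vol_m))
```

Comparing this figure across indexes for the same window is how one index (here, the NASDAQ Composite in the study) can be said to have been more volatile than another.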
APA, Harvard, Vancouver, ISO, and other styles
20

Barkino, Iliam, and Öman Marcus Rivera. "Enough is Enough : Sufficient number of securities in an optimal portfolio." Thesis, Uppsala universitet, Företagsekonomiska institutionen, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-298462.

Full text
Abstract:
This empirical study has shown that optimal portfolios need approximately 10 securities to diversify away the unsystematic risk. This challenges previous studies of randomly chosen portfolios, which state that at least 30 securities are needed. The result of this study sheds light upon the difference in risk diversification between random portfolios and optimal portfolios and is a valuable contribution for investors. The study suggests that a major part of the unsystematic risk in a portfolio can be diversified away with fewer securities by using portfolio optimization. Individual investors especially, who usually have portfolios consisting of few securities, benefit from these results. There are today multiple user-friendly software applications that can perform the computations of portfolio optimization without the user having to know the mathematics behind the program. Microsoft Excel's solver function is an example of a well-used software for portfolio optimization. In this study, however, MATLAB was used to perform all the optimizations. The study was executed on data of 140 stocks on NASDAQ Stockholm during 2000-2014. Multiple optimizations were done with varying input in order to yield a result that only depended on the investigated variable, that is, how many different stocks are needed to diversify away the unsystematic risk in a portfolio.
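The diversification effect the study measures can be sketched with the textbook formula for an equally weighted portfolio, whose variance splits into a diversifiable term that shrinks with the number of securities and an average-covariance floor that does not (the variance and covariance figures below are invented for illustration):

```python
def equal_weight_variance(n, avg_var, avg_cov):
    """Variance of an equally weighted n-security portfolio:
    the avg_var/n term (unsystematic risk) vanishes as n grows,
    leaving the average covariance (systematic risk)."""
    return avg_var / n + (1 - 1 / n) * avg_cov

# Invented figures: individual variance 0.04, average pairwise covariance 0.01.
for n in (1, 10, 30, 100):
    print(n, round(equal_weight_variance(n, 0.04, 0.01), 5))
```

Most of the drop happens by n = 10, which is consistent with the study's claim; optimized (rather than equally weighted) portfolios can reach the floor with even fewer securities.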


APA, Harvard, Vancouver, ISO, and other styles
21

Csörgö, Tomáš. "Meranie výkonnosti portfólia." Master's thesis, Vysoká škola ekonomická v Praze, 2013. http://www.nusl.cz/ntk/nusl-195516.

Full text
Abstract:
The goal of this master's thesis is to analyze portfolio performance. The theoretical part describes risk, portfolio performance measurement, investment funds, and portfolio theory. In the analytical part, portfolio performance is measured with different measurement tools.
APA, Harvard, Vancouver, ISO, and other styles
22

Paul, Leroy W. "The concurrent development scheduling problem (CDSP)." [Tampa, Fla.] : University of South Florida, 2005. http://purl.fcla.edu/fcla/etd/SFE0001411.

Full text
APA, Harvard, Vancouver, ISO, and other styles
23

Bradley, Jay. "Reinforcement learning for qualitative group behaviours applied to non-player computer game characters." Thesis, University of Edinburgh, 2010. http://hdl.handle.net/1842/4784.

Full text
Abstract:
This thesis investigates how to train the increasingly large cast of characters in modern commercial computer games. Modern computer games can contain hundreds or sometimes thousands of non-player characters that each should act coherently in complex dynamic worlds, and engage appropriately with other non-player characters and human players. Too often, it is obvious that computer-controlled characters are brainless zombies portraying the same repetitive hand-coded behaviour. Commercial computer games would seem a natural domain for reinforcement learning and, as the trend for selling games based on better graphics peaks with the saturation of game shelves with excellent graphics, it seems that better artificial intelligence is the next big thing. The main contribution of this thesis is a novel style of utility function, group utility functions, for reinforcement learning that could provide automated behaviour specification for large numbers of computer game characters. Group utility functions allow arbitrary functions of the characters' performance to represent relationships between characters and groups of characters. These qualitative relationships are learned alongside the main quantitative goal of the characters. Group utility functions can be considered a multi-agent extension of the existing programming-by-reward method, and an extension of the team utility function made more generic by replacing the sum function with potentially any other function. Hierarchical group utility functions, which are group utility functions arranged in a tree structure, allow character group relationships to be learned. For illustration, the empirical work shown uses the negative standard deviation function to create balanced (or equal performance) behaviours. This balanced behaviour can be learned between characters, groups and also between groups and single characters.
Empirical experiments show that a balancing group utility function can be used to engender an equal performance between characters, groups, and groups and single characters. It is shown that it is possible to trade some amount of quantitatively measured performance for some qualitative behaviour using group utility functions. Further experiments show how the results degrade as expected when the number of characters and groups is increased. Further experimentation shows that using function approximation to approximate the learners’ value functions is one possible way to overcome the issues of scale. All the experiments are undertaken in a commercially available computer game engine. In summary, this thesis contributes a novel type of utility function potentially suitable for training many computer game characters and, empirical work on reinforcement learning used in a modern computer game engine.
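The balancing group utility function described above rewards equal performance by taking the negative standard deviation of the members' scores. A minimal sketch of that reward term (the performance scores are invented for illustration):

```python
import statistics

def balance_utility(performances):
    """Group utility as negative standard deviation: highest (zero)
    when all characters perform equally, increasingly negative otherwise."""
    return -statistics.pstdev(performances)

balanced = [10.0, 10.0, 10.0]   # three characters scoring equally
lopsided = [2.0, 10.0, 18.0]    # same total, very unequal

print(balance_utility(balanced), balance_utility(lopsided))
```

Maximizing this term alongside the main quantitative reward is what trades some measured performance for the qualitative property of balance.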
APA, Harvard, Vancouver, ISO, and other styles
24

Pinkava, Ondřej. "Optimalizace portfolia cenných papírů." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2008. http://www.nusl.cz/ntk/nusl-221686.

Full text
Abstract:
This dissertation deals with the securities portfolio optimization. After introducing the definitions, I try to explain the particular investment instruments with regard to returns and risks. The following part provides a theory which tells more about different market risks and returns on the final securities portfolio. Concerning these models the effective portfolio has been set up.
APA, Harvard, Vancouver, ISO, and other styles
25

Příhoda, Martin. "Určení prostorových vztahů jeřábové dráhy." Master's thesis, Vysoké učení technické v Brně. Fakulta stavební, 2013. http://www.nusl.cz/ntk/nusl-226357.

Full text
Abstract:
The thesis deals with design of suitable methods to determine space relations of the particular crane tracks. These methods are investigated within the accuracy analysis. Test measurements using the designed methods are described and their results are analysed and compared with the applicable standards.
APA, Harvard, Vancouver, ISO, and other styles
26

Hagenfors, Rafail Linnea. "Spoken Lingua Franca English in an International Church in Sweden : An investigation of communicative effectiveness and attitudes in relation to deviation from Standard English in SOS Church." Thesis, Stockholms universitet, Engelska institutionen, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-77287.

Full text
Abstract:
This study is an investigation of communicative effectiveness and attitudes in relation to deviation from Standard English in an international church in Stockholm. This church is an English as a Lingua Franca (ELF) setting as the congregation consists almost entirely of people who use English as a means of communication with people who do not share their own first language. The study is based on empirical data from both qualitative and quantitative methods. The spoken language was investigated by analyzing one transcribed sermon and through interviewing two speakers of American English. Also a survey was done with 26 members of the church, obtaining quantitative data as well as several comments from the respondents on their view of the usage of English in the sermons and in the church in general.  The results from the study showed as expected that there were a number of deviations from Standard English when ELF was used in the sermon. However, these caused little irritation and were judged not to cause much misunderstanding. The deviations that did cause some irritation among the respondents from the church were when the wrong word was used as well as when a word was pronounced incorrectly. The results indicated that there was little disturbance regarding the communicativeness and attitudes in connection to the spoken English in this ELF setting.
APA, Harvard, Vancouver, ISO, and other styles
27

Marstorp, Gustav. "Automated Control System for Dust Concentration Measurements Using European Standard Reference Method." Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-292583.

Full text
Abstract:
Most companies that have any type of combustion or other pollution process via emission to air need to measure their emissions to ensure they are within legal boundaries. Among the different types of pollution measurements, one of the most common is dust concentration, also known as particle concentration. An important factor in dust concentration measurements is to ensure that the concentration of the measured dust is representative of the dust concentration in the emissions. This is measured as the isokinetic deviation, defined as (vn − vd)/vd, where vn is the velocity in the entry nozzle and vd the velocity in the duct. Methods of dust concentration measurement used today depend on manual tuning and sensor readings, and the isokinetic deviation is calculated after a test. The focus of this project was therefore to investigate how the process of dust concentration measurement using standard reference methods could be automated so that isokinetic sampling is controlled and regulated by an automated control system in real time. Pressures, temperatures and sampled gas volume were quantized. A PID controller was designed, implemented and tested. The PID controller took as input the differential pressure between the inside of the entry nozzle and the duct, called the zero pressure. The system was tested in a laboratory environment by letting a radial fan create a flow, and thus a zero pressure of -60 Pa, meaning that the pressure in the duct was 60 Pa greater than the pressure inside the entry nozzle. The PID controller was then enabled and ran for five minutes. The result showed that the PID controller managed to bring the system to the reference point in less than 50 seconds for entry nozzles of diameters 6 mm, 8 mm, 10 mm and 12 mm. The resulting isokinetic deviations were -12 %, -5 %, -6 % and -4 % for entry nozzles with diameters 6 mm, 8 mm, 10 mm and 12 mm respectively.
This is higher than the accepted values according to the European standard, which allows deviations in the interval -5 % to 15 %. However, these tests ran for relatively short time periods and started with large deviations, which made it difficult to reach an isokinetic deviation in the accepted interval. A possible improvement would be to include the real-time isokinetic deviation in the PID controller; this would make it possible to change the reference value of the zero pressure in real time and guarantee isokinetic deviations in the accepted interval, even in extraordinary situations.
EU-regler ställer krav på anläggningar att kontrollera och begränsa sina utsläpp av stoft enligt EU standard 13284-1:2017. Vid en stoftmätning måste det tas hänsyn till många parametrar, där en av de viktigaste parametrarna är att provtagningen ska utföras isokinetiskt. Isokinetisk provtagning innebär att hastigheten i kanalen (skorstenen) är samma som i sonden där provgasen sugs ut. Dagens metoder för stoftmätning förlitar sig på manuella inställningar och den isokinetiska avvikelsen beräknas efter ett test. Det resulterade i frågeställnigen hur en automatiserad metod för bestämning av masskoncentration av stoft kan utformas så att den isokinetiska avvikelsen beräknas i realtid. Tryck, temperatur och gasvolym kvantiserades från analoga sensorer och kommunicerades till en mikrokontroller med det seriella protokollet I2C. En PID-reglator designades, implementerades och testades. PID-regulatorn tog tryckskillnaden mellan kanal och sond som insignal. Utsignalen från PID-regulatorn var en spänning som via en motordriven ventil kontrollerade inflödet i munstycket. Systemet testades i laborativ miljö genom att låta en fläkt skapa ett flöde tills den uppmätta tryckskillnaden mellan sond och kanal var -60 Pa. Därefter aktiverades PID-regulatorn och testet pågick sedan i fem minuter. Testet utfördes för munstycken med diameterna 6 mm, 8 mm, 10 mm och 12 mm. Resultatet visade att PID-regulatorn styrde systemet till referenspunkten på mindre än 50 sekunder för samtliga diametrar på munstyckena. De isokinetiska avvikelserna (skillnaden i hastighet mellan munstycke och kanal) beräknades till -12 %, -5 %, -6 % och -4 % för munstyckena 6 mm, 8 mm, 10 mm och 12 mm. I två av fallen var det högre än det accepterade värdet enligt EU standarden som tillåter avvikelser inom intervallet -5 % till 15 %. Det kan förklaras av att testen utfördes under en relativ kort tidsperiod och startades med stora avvikelser. 
Regulatorn skulle dock kunna förbättras genom att använda testets aktuella isokinetiska avvikelse och med den informationen bestämma systemets referenspunkt. Det skulle göra det möjligt att kompensera för tidigare avvikelser och på det sättet uppnå isokinetiska avvikelser inom tillåtet intervall även för extremfall.
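The isokinetic-deviation formula and the discrete PID update this abstract describes can be sketched as follows (the gains and pressure readings are invented for illustration, not the tuning used in the thesis):

```python
def isokinetic_deviation(v_nozzle, v_duct):
    """Isokinetic deviation (vn - vd) / vd; 0.0 means perfectly isokinetic."""
    return (v_nozzle - v_duct) / v_duct

class PID:
    """Textbook discrete PID acting on the zero-pressure error."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

print(isokinetic_deviation(9.5, 10.0))            # -5 % deviation
pid = PID(kp=0.02, ki=0.005, kd=0.0, dt=0.5)      # invented gains
print(pid.step(setpoint=0.0, measured=-60.0))     # first valve correction
```

The improvement proposed at the end of the abstract would feed the real-time isokinetic deviation back into the setpoint passed to `step`, rather than keeping a fixed zero-pressure reference.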
APA, Harvard, Vancouver, ISO, and other styles
28

Kanyongo, Gibbs Y. "Using Large-Scale Datasets to Teach Abstract Statistical Concepts: Sampling Distribution." Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2012. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-82613.

Full text
APA, Harvard, Vancouver, ISO, and other styles
29

Dyverfeldt, Petter. "Estimation of Turbulence using Magnetic Resonance Imaging." Thesis, Linköping University, Department of Biomedical Engineering, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-7448.

Full text
Abstract:

In the human body, turbulent flow is associated with many complications. Turbulence typically occurs downstream from stenoses and heart valve prostheses and at branch points of arteries. A proper way to study turbulence may enhance the understanding of the effects of stenoses and improve the functional assessment of damaged heart valves and heart valve prostheses.

The methods of today for studying turbulence in the human body lack in either precision or speed. This thesis exploits a magnetic resonance imaging (MRI) phenomenon referred to as signal loss in order to develop a method for estimating turbulence intensity in blood flow.

MRI measurements were carried out on an appropriate flow phantom. The turbulence intensity results obtained by means of the proposed method were compared with previously known turbulence intensity results. The comparison indicates that the proposed method has great potential for estimation of turbulence intensity.

APA, Harvard, Vancouver, ISO, and other styles
30

Mattei, Lisa Marie. "Effects of Subglottic Stenosis and Cricotracheal Resection on Voice Production in Women." BYU ScholarsArchive, 2016. https://scholarsarchive.byu.edu/etd/6231.

Full text
Abstract:
Subglottic stenosis (SGS) is a narrowing of the airway in the region of the cricoid cartilage below the vocal folds and above the tracheal rings. Individuals with SGS experience difficulty breathing at rest and during exertion both of which become increasingly difficult with the level of stenosis severity. Some individuals also experience negative voice changes. Individuals whose stenoses significantly impact breathing generally require medical procedures or surgery, either balloon dilation or cricotracheal resection (CTR). CTR has been shown to improve patients' ability to breathe, but it can also result in permanent vocal changes. Alternatively, balloon dilation results in similar breathing improvements but for a relatively short period of time. Many studies have been published on the effectiveness of CTR; however, only a few have examined the effects of CTR on vocal production. The purpose of this study is to quantify the acoustic and auditory-perceptual features of subglottic stenosis and examine possible acoustic and auditory-perceptual changes in voice production following a revised CTR aimed to minimize voice impact in a group of women. A retrospective chart review identified women with idiopathic SGS who received revised CTR at The University of Utah Voice Disorders Center between 2008 and 2014. Presurgical and postsurgical groups included patients with both pre and post recordings (n = 11) as well as patients with only pre (n = 6) or post (n = 9) recordings. Acoustic quantification of voice signal periodicity, as well as cepstral, spectral, and fundamental frequency (F0) analyses were performed. Auditory-perceptual ratings of overall quality and monotonicity were performed. Cross-sectional and pre-post surgery analyses were completed. Aggregate analyses revealed that both pre and posttreatment SGS patients demonstrated voice disorders in the mild to moderate severity range. Pre-post comparisons indicated no significant voice change after surgery. 
Mean fundamental frequency decreased from 215 Hz (SD = 40 Hz) to 201 Hz (SD = 65 Hz). Voice disorder severity based on the Cepstral Spectral Index of Dysphonia™ for sustained vowels decreased (i.e., improved) from 41 (SD = 41) to 25 (SD = 21) points. Semitone standard deviation (2.2 semitones) was equivalent from pretreatment to posttreatment. Auditory-perceptual ratings demonstrated similar results. These preliminary results indicate that the revised CTR procedure is promising in minimizing adverse voice effects. Future research is needed to determine causative factors for pretreatment voice disorders, as well as to optimize treatments in this population.
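The semitone standard deviation reported above converts F0 values to a logarithmic semitone scale before taking the standard deviation, so the measure is independent of overall pitch level. A minimal sketch (the F0 samples are invented, not data from the study):

```python
import math
import statistics

def semitone_sd(f0_values, ref_hz=100.0):
    """Standard deviation of F0 after conversion to semitones re: ref_hz.
    The choice of reference only shifts the scale, so it cancels in the SD."""
    semitones = [12 * math.log2(f / ref_hz) for f in f0_values]
    return statistics.stdev(semitones)

f0 = [180.0, 200.0, 215.0, 230.0, 250.0]   # invented F0 samples, Hz
print(round(semitone_sd(f0), 2))
```

A value near 2 semitones, as in the study, indicates normal pitch variability; much lower values would suggest monotone voice production.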
APA, Harvard, Vancouver, ISO, and other styles
31

Francisco, Sérgio Luiz. "Abordagem do ensino de desvio padrão em livros didáticos." Universidade Federal de São Carlos, 2013. https://repositorio.ufscar.br/handle/ufscar/5955.

Full text
Abstract:
Made available in DSpace on 2016-06-02T20:29:25Z (GMT). No. of bitstreams: 1 5559.pdf: 4011578 bytes, checksum: da9d129d8556cec59dfe06e20b028d0b (MD5) Previous issue date: 2013-09-30
Financiadora de Estudos e Projetos
Teaching statistics as part of the Information Treatment axis is seen by the PCNs (Parâmetros Curriculares Nacionais) as a tool for interpreting the world that surrounds the pupil, in view of the diversity of areas that use the typical elements of statistics (such as tables, graphs, etc.) to disseminate information. Hence, the teaching of this subject should strive for the contextualization of its contents and provide grounds for decision-making. The objective of this work was to present the didactic sequence judged most appropriate, in terms of meaningfulness for the student, for teaching some of these elements of statistics, specifically the standard deviation, through the application of an exercise in a 3rd-year high-school class of the state public school system in the city of Jahu. In addition, after analysing seven textbooks indicated by the PNLEM (Programa Nacional do Livro Didático do Ensino Médio) and PNLD (Programa Nacional do Livro Didático) programmes with regard to their approach to teaching statistics, it was found that only one of them adopts this didactic sequence, in which the definitions of the normal curve are associated with the probabilities of a frequency distribution. The work is expected to lead teachers and pedagogical advisors to reflect on the teaching of this subject, both in classroom practice and in guidance for education professionals, as well as in the choice of textbooks and other teaching materials.
O ensino de Estatística como parte do eixo de Tratamento de Informações é visto pelos PCNs (Parâmetros Curriculares Nacionais) como uma ferramenta para a interpretação do mundo que cerca o aluno, tendo em vista a diversidade de áreas que utilizam os elementos típicos da Estatística (como tabelas, gráficos, etc) na divulgação de informações. Daí, a necessidade de que o ensino deste tema deva primar pela contextualização de seus conteúdos e fornecer subsídios para as tomadas de decisões. Assim, o objetivo deste trabalho foi apresentar uma sequência didática que julga ser mais adequada quanto à significância para o aluno no que diz respeito ao ensino de alguns desses elementos da Estatística, especificamente o Desvio Padrão, com a aplicação de um exercício em uma classe da 3ª. série do Ensino Médio da rede pública estadual de educação, na cidade de Jahu. Juntamente, acrescenta-se que, após a análise de sete livros didáticos indicados pelos programas PNLEM (Programa Nacional do Livro Didático do Ensino Médio) e PNLD (Programa Nacional do Livro Didático), quanto à abordagem do ensino de Estatística, verificou-se que em apenas um há a opção por essa sequência didática, na qual a abordagem das definições da Curva Normal é associada às Probabilidades referentes a uma distribuição de frequências. Espera-se levar professores e orientadores pedagógicos à reflexão sobre o ensino desse tema, tanto na atuação em sala de aula ou em orientações para profissionais da área de educação, como, na escolha de livros ou quaisquer outros materiais didáticos.
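The didactic sequence favoured above links the standard deviation to probabilities under the normal curve. A minimal sketch of that connection (the class grades are invented for illustration):

```python
import statistics
from statistics import NormalDist

grades = [5.0, 6.5, 7.0, 7.5, 9.0]     # invented class grades
mu = statistics.mean(grades)
sigma = statistics.pstdev(grades)

# Probability of a grade falling within one standard deviation of the mean,
# assuming the grades follow a normal distribution.
dist = NormalDist(mu, sigma)
p = dist.cdf(mu + sigma) - dist.cdf(mu - sigma)
print(round(p, 3))   # ≈ 0.683 for any normal distribution
```

The point of the sequence is exactly this last line: once the standard deviation is computed, the normal curve turns it into a statement about how frequently values fall within a given band around the mean.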
APA, Harvard, Vancouver, ISO, and other styles
32

Choudhury, Jenny, and Mete Pektas. "Etiska och traditionella fonders avkastning : En jämförande studie mellan etiska och traditionella fonder." Thesis, Södertörns högskola, Institutionen för samhällsvetenskaper, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-18591.

Full text
Abstract:
Purpose: The purpose of the thesis is to investigate whether the returns of ethical equity funds and traditional equity funds are equivalent. The thesis further aims to clarify how ethical funds are defined from a theoretical perspective, based on current research. Method: The study is quantitative in nature and was carried out using fund data collected from each of the major banks and from Morningstar. The quantitative material consists of the funds' annual returns. The period studied runs from December 2008 to December 2012. Theory: Beta, the Sharpe ratio and Modern Portfolio Theory. Conclusion: The study finds no major differences in returns between the ethical and the traditional funds; the traditional fund group had marginally better returns.
APA, Harvard, Vancouver, ISO, and other styles
33

Huang, Bing. "Understanding Operating Speed Variation of Multilane Highways with New Access Density Definition and Simulation Outputs." Scholar Commons, 2012. http://scholarcommons.usf.edu/etd/4079.

Full text
Abstract:
Traffic speed is generally considered a core issue in roadway safety. Previous studies show that faster travel is not necessarily associated with an increased risk of being involved in a crash. When vehicles travel in the same direction at the same speed (even high speeds, as on interstates), they are not passing one another and cannot collide as long as they maintain that speed. Conversely, the frequency of crashes increases when vehicles travel at different rates of speed. There is no doubt that the greater the speed variation, the greater the number of interactions among vehicles, resulting in higher crash potential. This research tries to identify all major factors associated with speed variation on multilane highways, including roadway access density, which is considered the most obvious contributing factor. In addition, other factors are considered for this purpose, such as the configuration of speed limits, characteristics of traffic volume, geometrics of roadways, driver behavior, environmental factors, etc. A microscopic traffic simulation method based on TSIS (Traffic Software Integrated System) is used to develop mathematical models to quantify the impacts of all possible factors on speed variation.
APA, Harvard, Vancouver, ISO, and other styles
34

Jomaa, Diala. "A data driven approach for automating vehicle activated signs." Doctoral thesis, Högskolan Dalarna, Datateknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:du-21504.

Full text
Abstract:
Vehicle activated signs (VAS) display a warning message when drivers exceed a particular threshold. VAS are often installed on local roads to display a warning message depending on the speed of the approaching vehicles. VAS are usually powered by electricity; however, battery and solar powered VAS are also commonplace. This thesis investigated devel-opment of an automatic trigger speed of vehicle activated signs in order to influence driver behaviour, the effect of which has been measured in terms of reduced mean speed and low standard deviation. A comprehen-sive understanding of the effectiveness of the trigger speed of the VAS on driver behaviour was established by systematically collecting data. Specif-ically, data on time of day, speed, length and direction of the vehicle have been collected for the purpose, using Doppler radar installed at the road. A data driven calibration method for the radar used in the experiment has also been developed and evaluated. Results indicate that trigger speed of the VAS had variable effect on driv-ers’ speed at different sites and at different times of the day. It is evident that the optimal trigger speed should be set near the 85th percentile speed, to be able to lower the standard deviation. In the case of battery and solar powered VAS, trigger speeds between the 50th and 85th per-centile offered the best compromise between safety and power consump-tion. Results also indicate that different classes of vehicles report differ-ences in mean speed and standard deviation; on a highway, the mean speed of cars differs slightly from the mean speed of trucks, whereas a significant difference was observed between the classes of vehicles on lo-cal roads. A differential trigger speed was therefore investigated for the sake of completion. A data driven approach using Random forest was found to be appropriate in predicting trigger speeds respective to types of vehicles and traffic conditions. 
The fact that the predicted trigger speed was found to be consistently around the 85th percentile speed justifies the choice of the automatic model.
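The 85th percentile rule recommended above can be sketched in a few lines; the speed sample is hypothetical and the function is an illustration of the percentile computation, not the thesis's implementation:

```python
def trigger_speed(speeds, percentile=85.0):
    """Trigger speed set at the given percentile of observed vehicle speeds.

    Uses linear interpolation between order statistics; `speeds` is a
    sequence of spot speeds (e.g. km/h) collected at the site.
    """
    s = sorted(speeds)
    k = (len(s) - 1) * percentile / 100.0
    lo = int(k)
    hi = min(lo + 1, len(s) - 1)
    return s[lo] + (k - lo) * (s[hi] - s[lo])

# Hypothetical spot speeds (km/h) at a 50 km/h site.
sample = [42, 45, 47, 48, 50, 51, 53, 55, 58, 62]
print(trigger_speed(sample))  # 85th percentile speed of the sample
```

For a battery or solar powered VAS, the same function called with a percentile between 50 and 85 gives the power-saving compromise mentioned in the abstract.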
APA, Harvard, Vancouver, ISO, and other styles
35

Macêdo, Guilherme Ribeiro de. "Análise da volatilidade de séries financeiras segundo a modelagem da família GARCH." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2009. http://hdl.handle.net/10183/15598.

Full text
Abstract:
O conhecimento do risco de ativos financeiros é de fundamental importância para gestão ativa de carteiras, determinação de preços de opções e análise de sensibilidade de retornos. O risco é medido através da variância estatística e há na literatura diversos modelos econométricos que servem a esta finalidade. Esta pesquisa contempla o estudo de modelos determinísticos de volatilidade, mais especificamente os modelos GARCH simétricos e assimétricos. O período de análise foi dividido em dois: de janeiro de 2000 à fevereiro de 2008 e à outubro de 2008. Tal procedimento foi adotado procurando identificar a influência da crise econômica originada nos EUA nos modelos de volatilidade. O setor escolhido para o estudo foi o mercado de petróleo e foram escolhidas as nove maiores empresas do setor de acordo com a capacidade produtiva e reservas de petróleo. Além destas, foram modeladas também as commodities negociadas na Bolsa de Valores de Nova York: o barril de petróleo do tipo Brent e WTI. A escolha deste setor deve-se a sua grande importância econômica e estratégica para todas as nações. Os resultados encontrados mostraram que não houve um padrão de modelo de volatilidade para todos os ativos estudados e para a grande maioria dos ativos, há presença de assimetria nos retornos, sendo o modelo GJR (1,1) o que mais prevaleceu, segundo a modelagem pelo método da máxima verossimilhança. Houve aderência, em 81% dos casos, dos ativos a um determinado modelo de volatilidade, alterando apenas, como eram esperados, os coeficientes de reatividade e persistência. Com relação a estes, percebe-se que a crise aumentou os coeficientes de reatividade para alguns ativos. Ao se compararem as volatilidades estimadas de curto prazo, percebe-se que o agravamento da crise introduziu uma elevação média de 265,4% em relação ao período anterior, indicando um aumento substancial de risco. 
Para a volatilidade de longo prazo, o aumento médio foi de 7,9%, sugerindo que os choques reativos introduzidos com a crise, tendem a ser dissipados ao longo do tempo.
The knowledge of the risk of financial assets is of fundamental importance for the active management of portfolios, option pricing and the sensitivity analysis of returns. Risk is measured through the statistical variance, and the literature offers several econometric models for this purpose. This research studies deterministic models of volatility, more specifically symmetric and asymmetric GARCH models. The period of analysis was divided in two: from January 2000 to February 2008 and to October 2008. This procedure was adopted in order to identify the influence of the economic crisis originating in the USA on the volatility models. The sector chosen for the study was the oil market, and the nine largest companies of the sector were selected according to productive capacity and oil reserves. In addition, the commodities traded on the New York Stock Exchange, the Brent and WTI barrels of oil, were also modelled. This sector was chosen because of its great economic and strategic importance for all nations. The results showed that there was no single volatility model that fit all the studied assets and that, for most of them, asymmetry is present in the returns, with the GJR (1,1) model prevailing most often under maximum likelihood estimation. In 81% of the cases the assets adhered to a given volatility model, with only the reactivity and persistence coefficients changing. Regarding these, the crisis increased the reactivity coefficients for some assets. Comparing the estimated short-term volatilities, the worsening of the crisis introduced an average increase of 265.4% over the previous period, indicating a substantial increase in risk. 
Regarding long-term volatility, the average increase was 7.9%, suggesting that the reactive shocks introduced by the crisis tend to be dissipated over time.
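The GJR(1,1) specification named in the abstract adds a leverage term to the GARCH variance recursion, so that negative returns raise next-period variance more than positive returns of the same size. A minimal sketch of the recursion (the parameter values are illustrative, not estimates from the thesis):

```python
def gjr_garch_variance(returns, omega, alpha, gamma, beta):
    """Conditional variance path of a GJR-GARCH(1,1) model:

    sigma2_t = omega + (alpha + gamma * I[r_{t-1} < 0]) * r_{t-1}^2 + beta * sigma2_{t-1}
    """
    # Initialise with the unconditional sample variance.
    mean = sum(returns) / len(returns)
    sigma2 = [sum((r - mean) ** 2 for r in returns) / len(returns)]
    for r in returns[:-1]:
        shock = (alpha + (gamma if r < 0 else 0.0)) * r ** 2
        sigma2.append(omega + shock + beta * sigma2[-1])
    return sigma2

# Illustrative daily returns and parameters.
vols = gjr_garch_variance([0.01, -0.02, 0.015, -0.03],
                          omega=1e-6, alpha=0.05, gamma=0.1, beta=0.9)
```

With gamma > 0 the asymmetry is visible directly: a negative shock of a given magnitude produces a larger conditional variance than an equal positive one.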
APA, Harvard, Vancouver, ISO, and other styles
36

Sjöstrand, Victor, and Kanstedt Albert Svensson. "Evaluation regarding the US fund market : A comparison between different US fund risk classes and their performance." Thesis, Linnéuniversitetet, Institutionen för ekonomistyrning och logistik (ELO), 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-104549.

Full text
Abstract:
The intent of this thesis is to investigate how the performance of US equity funds differs with their standard deviation. In order to accomplish this study, we collected daily data for 99 US equity funds for the period 2011-2020 and divided the funds into three risk classification groups based on their standard deviation for the year 2011. The collected data were used to perform a CAPM regression and to calculate returns on a three-, five- and ten-year basis. The results of the regression and the returns for the funds were then presented as average values for the different risk classification groups. We then compared the average outcomes for the three risk classifications with each other and with the index S&P 500. Our results showed that the index S&P 500 outperformed the average returns of all three risk classification groups for every time period. We also noticed that the difference between the average returns and the index grew over time. We did not find any big differences between our risk classifications when it comes to their performance. Our regression analysis resulted in many negative alpha values, indicating that the S&P 500, as many previous studies claim, outperforms actively managed mutual funds. The conclusion is therefore that we could not show any evidence of a major difference in performance between our risk groups, but also that it is difficult for fund managers to outperform the index.
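The CAPM regression mentioned above fits fund returns against market returns; a negative intercept (alpha) is what signals underperformance relative to the index. A self-contained OLS sketch with made-up return series:

```python
def capm_ols(fund, market):
    """OLS fit of fund_t = alpha + beta * market_t + e_t (CAPM regression)."""
    n = len(fund)
    mx = sum(market) / n
    my = sum(fund) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(market, fund))
    var = sum((x - mx) ** 2 for x in market)
    beta = cov / var
    alpha = my - beta * mx
    return alpha, beta

# Made-up daily returns: this fund tracks the market with beta 1.2.
market = [0.010, -0.020, 0.005, 0.030]
fund = [0.001 + 1.2 * r for r in market]
alpha, beta = capm_ols(fund, market)
```

In the thesis's setting, `fund` and `market` would be the daily fund and S&P 500 return series, and a fitted alpha below zero matches the negative alpha values reported.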
APA, Harvard, Vancouver, ISO, and other styles
37

Вірченко, В. В. "Статистичне моделювання статичної характеристики пневматичного перетворювача." Master's thesis, Сумський державний університет, 2019. http://essuir.sumdu.edu.ua/handle/123456789/76458.

Full text
Abstract:
An algorithm and a computer program in the C++ programming language were developed to find the dependences of the mathematical expectation of the transducer's output parameter and of its standard deviation on the flapper displacement, for several intervals over which the supply pressure varies. These dependences make it possible to substantiate the requirements for the automatic supply-pressure control system of the transducer.
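The quantities that program computes, the expectation and standard deviation of the transducer output at a given flapper displacement as the supply pressure varies over an interval, can be sketched as follows; the static characteristic used here is a hypothetical stand-in, not the transducer model from the thesis:

```python
import math

def output_stats(displacement, pressures, characteristic):
    """Mean and standard deviation of the transducer output at one flapper
    displacement, over a set of supply-pressure values."""
    outputs = [characteristic(displacement, p) for p in pressures]
    n = len(outputs)
    mean = sum(outputs) / n
    std = math.sqrt(sum((o - mean) ** 2 for o in outputs) / n)
    return mean, std

# Hypothetical linear characteristic: output proportional to pressure and displacement.
char = lambda x, p: 0.01 * x * p
mean, std = output_stats(5.0, [95, 100, 105], char)
```

Repeating the call over a grid of displacements and several pressure intervals yields the dependences described in the abstract.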
APA, Harvard, Vancouver, ISO, and other styles
38

Křižka, Adam. "Diverzifikace portfolia prostřednictvím investic do burzovních indexů." Master's thesis, Vysoké učení technické v Brně. Fakulta podnikatelská, 2020. http://www.nusl.cz/ntk/nusl-414481.

Full text
Abstract:
The diploma thesis focuses on the design of suitable stock exchange indices for portfolio diversification. The essence and principle of functioning of financial markets and investment funds is presented. According to suitable indicators, stock exchange indices are analyzed and compared with the market. Suitable indices are verified by means of correlation analysis and subsequently recommended to diversify the portfolios of investment funds managed through the investment company.
APA, Harvard, Vancouver, ISO, and other styles
39

Echiejile, Faith. "Analysis of Monthly Suspended Sediment Load in Rivers and Streams Using Linear Regression and Similar Precipitation Data." Youngstown State University / OhioLINK, 2021. http://rave.ohiolink.edu/etdc/view?acc_num=ysu1629203139818238.

Full text
APA, Harvard, Vancouver, ISO, and other styles
40

Mossberg, Anneli. "Improving the Modeling Framework for DCE-MRI Data in Hepatic Function Evaluation." Thesis, Linköpings universitet, Institutionen för medicin och hälsa, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-94306.

Full text
Abstract:
Background Mathematical modeling combined with prior knowledge of the pharmacokinetics of the liver specific contrast agent Gd-EOB-DTPA has the potential to extract more information from Dynamic Contrast Enhanced Magnetic Resonance Imaging (DCE-MRI) data than previously possible. The ultimate goal of that work is to create a liver model that can describe DCE-MRI data well enough to be used as a diagnostic tool in liver function evaluation. Thus far this goal has not been fully reached and there is still some work to be done in this area. In this thesis, an already existing liver model will be implemented in the software Wolfram SystemModeler (WSM), the corresponding modeling framework will be further developed to better handle the temporally irregular sampling of DCE-MRI data and finally an attempt will be made to determine an optimal sampling design in terms of when and how often to collect images. In addition to these original goals, the work done during this project revealed two more issues that needed to be dealt with. Firstly, new standard deviation (SD) estimation methods regarding non-averaged DCE-MRI data were required in order to statistically evaluate the models. Secondly, the original model’s poor capability of describing the early dynamics of the system led to the creation of an additional liver model in attempt to model the bolus effect. Results The model was successfully implemented in WSM whereafter regional optimization was implemented as an attempt to handle clustered data. Tests on the available data did not result in any substantial difference in optimization outcome, but since the analyses were performed on only three patient data sets this is not enough to disregard the method. As a means of determining optimal sampling times, the determinant of the inverse Fisher Information Matrix was minimized, which revealed that frequent sampling is most important during the initial phase (~50-300 s post injection) and at the very end (~1500-1800 s). 
Three new means of estimating the SD were proposed. Of these three, a spatio-temporal SD was deemed most reasonable under the current circumstances. If a better initial fit is achieved, yet another method, estimating the variance as an optimization parameter, might be implemented. As a result of the new standard deviation, the model failed to be statistically accepted during optimizations. The additional model that was created to include the bolus effect, and therefore to fit the initial phase data better, was also rejected. Conclusions The value of regional optimization is uncertain at this time, and additional tests must be made on a large number of patient data sets in order to determine it. The Fisher Information Matrix will be of great use in determining when and how often to sample once the model has achieved a more acceptable fit in both the early and the late phase of the system. Even though the indication that it is important to sample densely in the early phase is rather intuitive, given the poor model fit in that region, the analyses also revealed that the final observations have a relatively high impact on the model prediction error. This was not previously known. Hence, an important measure of how suitable a sampling design is, in terms of the resulting model accuracy, has been suggested. The original model was rejected due to its inability to fit the data during the early phase. This poor initial fit could not be improved sufficiently by modelling the bolus effect, and so the new implementation of the model was also rejected. Recommendations have been made in this thesis that might assist in the further development of the liver model so that it can describe the true physiology and behaviour of the system in all phases. 
Such recommendations include, but are not limited to, the addition of an extra blood plasma compartment, a more thorough modelling of the spleen's uptake of the contrast agent and a separation of certain differing signals that are now averaged.
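The D-optimality criterion described above (minimising the determinant of the inverse Fisher Information Matrix, equivalently maximising det(JᵀJ) for the sensitivity matrix J) can be illustrated on a toy mono-exponential signal model; the model, time constant and candidate designs below are assumptions for illustration, not the thesis's liver model:

```python
import numpy as np

def fim_det(times, A=1.0, tau=300.0):
    """det(J^T J) for y(t) = A * exp(-t / tau) with unit noise; a larger value
    means a smaller det of the inverse FIM, i.e. a more informative design."""
    t = np.asarray(times, dtype=float)
    dA = np.exp(-t / tau)                      # sensitivity dy/dA
    dtau = A * t / tau**2 * np.exp(-t / tau)   # sensitivity dy/dtau
    J = np.column_stack([dA, dtau])
    return float(np.linalg.det(J.T @ J))

early = [50, 100, 150, 200, 250, 300]         # dense early sampling (s)
late = [1500, 1560, 1620, 1680, 1740, 1800]   # late-only sampling (s)
```

For this toy model the early design is far more informative, echoing the thesis's finding that dense sampling in the initial phase matters most.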
APA, Harvard, Vancouver, ISO, and other styles
41

Elman, Beatrice, and Sebastian Pers. "Corporate Social Responsibility och riskpåverkan : En studie av det sociala ansvarstagandets effekt på risk i Svenska börsbolag." Thesis, Södertörns högskola, Institutionen för samhällsvetenskaper, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-30714.

Full text
Abstract:
This study uses a quantitative method that aims to investigate the relationship between corporate social responsibility (CSR) and firm risk within Swedish public companies. Despite previous research on Anglo-Saxon companies with similar results, the authors found cause for further investigation. The authors identified differences in the Swedish context that could affect the previously found negative relation between CSR and firm risk, thereby legitimizing further examination. The research is built on secondary data collected from Nasdaq, Morningstar, Orbis and the CSRhub database. Drawing on relevant theory and current research, it develops a hypothesis which states that as CSR increases, firm risk is reduced, in accordance with previous research. Testing was done with Pearson's bivariate correlation and a multivariate regression analysis, controlling for various firm characteristics. The study found no connection between market risk and CSR, but could not determine whether a relationship between CSR and total risk exists within the population, thus only partly rejecting the hypothesis. The study draws attention to how the relation between CSR and risk could differ in a context outside the typical Anglo-Saxon population. It could also be used as a basis for further research on the cause of the lack of relation between CSR and market risk in this study's particular population.
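The Pearson bivariate correlation used in the testing step can be sketched directly; the CSR-score and risk series below are made up, constructed to show a perfectly negative relation:

```python
def pearson(x, y):
    """Sample Pearson correlation coefficient between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Made-up CSR scores and firm total risk, exactly linear with negative slope.
csr = [40, 55, 60, 75]
risk = [0.30, 0.27, 0.26, 0.23]
r = pearson(csr, risk)
```

A value near -1, as here, would correspond to the hypothesised risk-reducing effect of CSR; the study's actual finding was no such connection for market risk.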
APA, Harvard, Vancouver, ISO, and other styles
42

Šebek, Miloš. "Stanoveni kavitace na ventilu z poklesu průtočnosti a z vysokofrekvenčních pulsací tlaku." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2010. http://www.nusl.cz/ntk/nusl-229051.

Full text
Abstract:
The main issue of this master's thesis is the high-frequency pulsations caused by cavitating hydraulic components (in this case a nozzle and a throttle valve). In the first measurement on the nozzle, the high-frequency sensor was not set up correctly, so the evaluation was incorrect. After the set-up was corrected, the nozzle was measured again and this time the pulsations were evaluated correctly. During the last measurement the sensor was located behind the throttle valve, which was measured during gradual opening and developing cavitation. The resulting dependencies were worked out as time dependences. A special mathematical method, the Fourier transformation, was used; it transformed the pressure amplitudes into a frequency dependence. The evaluation of these dependencies is the basic step for assessing the frequency band in which cavitation on the particular components occurs.
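The Fourier-transformation step, moving the sampled pressure signal from the time domain to an amplitude-versus-frequency dependence, can be sketched as follows; the test signal is synthetic, not a measurement from the thesis:

```python
import numpy as np

def amplitude_spectrum(signal, fs):
    """One-sided amplitude spectrum of a sampled pressure signal, i.e. the
    Fourier-transformation step from time domain to frequency domain."""
    n = len(signal)
    spec = 2.0 * np.abs(np.fft.rfft(signal)) / n
    spec[0] /= 2.0  # the DC term must not be doubled
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, spec

# Synthetic check signal: a 50 Hz pressure pulsation sampled at 1 kHz for 1 s.
fs = 1000
t = np.arange(fs) / fs
freqs, spec = amplitude_spectrum(np.sin(2 * np.pi * 50 * t), fs)
```

Applied to the measured pressure record, the peaks of `spec` identify the frequency band in which the cavitation-induced pulsations occur.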
APA, Harvard, Vancouver, ISO, and other styles
43

Kotěšovcová, Jana. "Analýza výkonnosti a kredibility tuzemských penzijních fondů." Doctoral thesis, Vysoká škola ekonomická v Praze, 2008. http://www.nusl.cz/ntk/nusl-163016.

Full text
Abstract:
This dissertation focuses on the evaluation of the performance and credibility of domestic pension funds. It includes information about the pension systems of six selected countries (Chile, Hungary, Switzerland, Poland, Sweden and Slovakia) and culminates with a proposal for pension reform in the Czech Republic. The evaluation of the performance of pension funds is based on experience with measuring performance in twenty-three countries of the world, processed for the OECD, and cites the original source materials for proposals for the regulation of pension funds in the Czech Republic. The setting of indicators is preceded by an analysis of performance measurements by the Association of Pension Funds of the Czech Republic and the Research Institute of Labour and Social Affairs. The most important proposed indicators include the actual result in relation to the participants' funds, which captures the influence of changes in the market values of financial assets, and the indicator of the coverage of obligations, reflecting the influence of changes in costs during subsequent periods, which include as yet undistributed commissions for mediators of pension insurance. For a comparison of benefits and risks, mutual funds denominated in CZK were selected and verified as a suitable alternative for long-term saving. The method of evaluation of pension funds itself is based on the explanation, evaluation and selection of suitable indicators for determining the rating of domestic pension funds, which clients could use as a method for evaluating the credibility and benefits of a specific domestic pension fund for the purpose of providing for retirement age.
APA, Harvard, Vancouver, ISO, and other styles
44

Kepák, Petr. "Identifikace pauz v rušeném řečovém signálu." Master's thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2011. http://www.nusl.cz/ntk/nusl-218837.

Full text
Abstract:
The basic problem of speech processing is the complete separation of the natural noise arising from the correct articulation of voiced and unvoiced consonants from the noise and disturbances of the environment. The objective of this master's thesis is to find an effective method that can identify the pauses without speech activity, from which the properties of the noise and disturbance can be identified. Once the noise is correctly identified, different methods can be used for its removal. The master's thesis describes two methods of pause identification. These methods are programmed in Matlab and tested on nine speech recordings. The analysis of the methods' results was performed using ROC (Receiver Operating Characteristic) curves. At the end, the results of the analysis of the created methods are summarized.
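The ROC analysis used to compare the two detectors traces the trade-off between true and false pause detections as the decision threshold varies. A minimal sketch (the frame scores, labels and thresholds are hypothetical):

```python
def roc_points(scores, labels, thresholds):
    """(false positive rate, true positive rate) of a threshold detector,
    one point per threshold; label 1 marks a true pause, 0 speech activity."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for th in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s >= th and y == 1)
        fp = sum(1 for s, y in zip(scores, labels) if s >= th and y == 0)
        points.append((fp / neg, tp / pos))
    return points

# Hypothetical pause-likelihood scores for four frames, two of which are real pauses.
pts = roc_points([0.9, 0.8, 0.4, 0.2], [1, 1, 0, 0], [0.5, 0.85])
```

Sweeping many thresholds and plotting the points yields the ROC curve; the detector whose curve lies closer to the top-left corner performs better.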
APA, Harvard, Vancouver, ISO, and other styles
45

Nutter, David B. "Sound Absorption and Sound Power Measurements in Reverberation Chambers Using Energy Density Methods." Diss., CLICK HERE for online access, 2006. http://contentdm.lib.byu.edu/ETD/image/etd1546.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Jomaa, Diala. "The Optimal trigger speed of vehicle activated signs." Licentiate thesis, Högskolan Dalarna, Mikrodataanalys, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:du-17538.

Full text
Abstract:
The thesis aims to elaborate on the optimum trigger speed for Vehicle Activated Signs (VAS) and to study the effectiveness of the VAS trigger speed on drivers' behaviour. Vehicle activated signs (VAS) are speed warning signs that are activated by an individual vehicle when the driver exceeds a speed threshold. The threshold which triggers the VAS is commonly based on the driver's speed and, accordingly, is called a trigger speed. At present, the trigger speed activating the VAS is usually set to a constant value and does not consider the fact that an optimal trigger speed might exist. The optimal trigger speed significantly impacts driver behaviour. In order to fulfil the aims of this thesis, systematic vehicle speed data were collected in field experiments that utilized Doppler radar. Furthermore, calibration methods for the radar used in the experiment were developed and evaluated to provide accurate data. The calibration method was bidirectional, consisting of data cleaning and data reconstruction. The calibration based on data cleaning performed better than the calibration based on the reconstructed data. To study the effectiveness of the trigger speed on driver behaviour, the collected data were analysed by both descriptive and inferential statistics. Both showed that a change in trigger speed had an effect on the vehicles' mean speed and on the standard deviation of their speed. When the trigger speed was set near the speed limit, the standard deviation was high. Therefore, the choice of trigger speed cannot be based solely on the speed limit at the proposed VAS location. The optimal trigger speeds for VAS were not considered in previous studies, and the relationship between the trigger value and its consequences under different conditions was not clearly stated. 
The finding of this thesis is that the optimal trigger speed should primarily be based on lowering the standard deviation rather than on lowering the mean speed of vehicles. Furthermore, the optimal trigger speed should be set near the 85th percentile speed, with the goal of lowering the standard deviation.
APA, Harvard, Vancouver, ISO, and other styles
47

Nguyen, Huu-Nghi. "Estimation de l’écart type du délai de bout-en-bout par méthodes passives." Thesis, Lyon, 2017. http://www.theses.fr/2017LYSE1044/document.

Full text
Abstract:
Depuis l'avènement du réseau Internet, le volume de données échangées sur les réseaux a crû de manière exponentielle. Le matériel présent sur les réseaux est devenu très hétérogène, dû entre autres à la multiplication des "middleboxes" (parefeux, routeurs NAT, serveurs VPN, proxy, etc.). Les algorithmes exécutés sur les équipements réseaux (routage, “spanning tree”, etc.) sont souvent complexes, parfois fermés et propriétaires et les interfaces de supervision peuvent être très différentes d'un constructeur/équipement à un autre. Ces différents facteurs rendent la compréhension et le fonctionnement du réseau complexe. Cela a motivé la définition d'un nouveau paradigme réseaux afin de simplifier la conception et la gestion des réseaux : le SDN (“Software-defined Networking”). Il introduit la notion de contrôleur, qui est un équipement qui a pour rôle de contrôler les équipements du plan de données. Le concept SDN sépare donc le plan de données chargés de l'acheminement des paquets, qui est opéré par des équipements nommés virtual switches dans la terminologie SDN, et le plan contrôle, en charge de toutes les décisions, et qui est donc effectué par le contrôleur SDN. Pour permettre au contrôleur de prendre ses décisions, il doit disposer d'une vue globale du réseau. En plus de la topologie et de la capacité des liens, des critères de performances comme le délai, le taux de pertes, la bande passante disponible, peuvent être pris en compte. Cette connaissance peut permettre par exemple un routage multi-classes, ou/et garantir des niveaux de qualité de service. Les contributions de cette thèse portent sur la proposition d'algorithmes permettant à une entité centralisée, et en particulier à un contrôleur dans un cadre SDN, d'obtenir des estimations fiables du délai de bout-en-bout pour les flux traversant le réseau. Les méthodes proposées sont passives, c'est-à-dire qu'elles ne génèrent aucun trafic supplémentaire. 
Nous nous intéressons tout particulièrement à la moyenne et l'écart type du délai. Il apparaît que le premier moment peut être obtenu assez facilement. Au contraire, la corrélation qui apparaît dans les temps d'attentes des noeuds du réseau rend l'estimation de l'écart type beaucoup plus complexe. Nous montrons que les méthodes développées sont capables de capturer les corrélations des délais dans les différents noeuds et d'offrir des estimations précises de l'écart type. Ces résultats sont validés par simulations où nous considérons un large éventail de scénarios permettant de valider nos algorithmes dans différents contextes d'utilisation
Since the early days of the Internet, the amount of data exchanged over networks has grown exponentially. The devices deployed on networks are very heterogeneous, because of the growing presence of middleboxes (e.g., firewalls, NAT routers, VPN servers, proxies). The algorithms run on networking devices (e.g., routing, spanning tree) are often complex, closed, and proprietary, while the interfaces to access these devices typically vary from one manufacturer to another. All these factors tend to hinder the understanding and the management of networks. Therefore a new paradigm has been introduced to ease the design and the management of networks, namely SDN (Software-Defined Networking). In particular, SDN defines a new entity, the controller, that is in charge of controlling the devices belonging to the data plane. Thus, in an SDN network, the data plane, which is handled by networking devices called virtual switches, and the control plane, which takes the decisions and is executed by the controller, are separated. In order to let the controller take its decisions, it must have a global view of the network. This includes the topology of the network and its link capacities, along with other possible performance metrics such as delays, loss rates, and available bandwidths. This knowledge can enable multi-class routing, or help guarantee levels of Quality of Service. The contributions of this thesis are new algorithms that allow a centralized entity, such as the controller in an SDN network, to accurately estimate the end-to-end delay for a given flow in its network. The proposed methods are passive in the sense that they do not require any additional traffic to be run. More precisely, we study the expectation and the standard deviation of the delay. We show how the first moment can be easily computed. On the other hand, estimating the standard deviation is much more complex because of the correlations existing between the different waiting times. 
We show that the proposed methods are able to capture these correlations between delays and thus provide accurate estimations of the standard deviation of the end-to-end delay. Simulations that cover a large range of possible scenarios validate these results
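The point about correlations can be made concrete: the end-to-end delay is the sum of per-node waiting times, and its variance includes covariance terms that an independence assumption would drop. A minimal sketch with made-up per-node statistics:

```python
import math

def end_to_end_std(variances, covariances):
    """Std of a sum of waiting times:
    Var(sum X_i) = sum Var(X_i) + 2 * sum_{i<j} Cov(X_i, X_j)."""
    return math.sqrt(sum(variances) + 2.0 * sum(covariances))

# Two nodes with unit variance and positively correlated queues (made-up values).
correlated = end_to_end_std([1.0, 1.0], [0.5])
independent = end_to_end_std([1.0, 1.0], [])  # what an independence assumption yields
```

With positively correlated queues the independence assumption underestimates the end-to-end standard deviation, which is why capturing the covariances matters for the thesis's estimators.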
APA, Harvard, Vancouver, ISO, and other styles
48

Moutáfov, Ernesto, and Legrand Giovanni Perez. "Hög avkastning till låg risk : En jämförande studie mellan aktieportföljers innehåll och prestation." Thesis, Södertörns högskola, Institutionen för ekonomi och företagande, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:sh:diva-16863.

Full text
Abstract:
Syfte: Studera sju portföljer och notera den bästa typen av portfölj med högst avkastning till lägst risk. Metod: Sekundärdata är grunden för uträkning av samtliga portföljers avkastningar, risker och korrelation. Studien är deduktiv med kvantitativa inslag av kända teorier av nobelpristagare i ekonomisk vetenskap.  Slutsats: Studien visar att stora bolag i olika branscher är ett vinnande portföljinnehåll för denna studie. Stora bolags aktier har visat högre avkastning till lägre risk jämfört med små bolag under studiens tid då ekonomiska kriser drabbade marknaden. Den mest presterande portföljen var därför storbolagsportföljen. Vidare forskning: Längre tidsperspektiv och nya teorier som Jensens alfa samt Treynorkvot är av intresse för vidare forskning för att styrka vår slutsats.
Intention: To study seven portfolios and identify the best type of portfolio, with the maximum return at a minimum risk. Method: Secondary data is the basis for the calculation of the portfolios' returns, risks and correlations. This study is deductive, using a quantitative method based on well-known theories of Nobel laureates in economic sciences. Conclusion: The study shows that the most efficient portfolio contains large companies in different lines of business. Large companies' shares showed higher returns at lower risk compared to small companies during the period of the study, when economic crises hit the market. The best-performing portfolio was the portfolio with large companies.                                       Further Research: A longer period of study and the application of theories such as Jensen's alpha and the Treynor ratio would be of interest for further research to support our conclusion.
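The diversification effect behind these portfolio comparisons can be sketched with the two-asset Markowitz formula; the weights, volatilities and correlation below are made up for illustration:

```python
import math

def portfolio_std(w1, s1, w2, s2, rho):
    """Standard deviation of a two-asset portfolio:
    sqrt(w1^2 s1^2 + w2^2 s2^2 + 2 w1 w2 rho s1 s2)."""
    return math.sqrt((w1 * s1) ** 2 + (w2 * s2) ** 2
                     + 2.0 * w1 * w2 * rho * s1 * s2)

# Equal weights in two assets with 20% volatility each.
full_corr = portfolio_std(0.5, 0.2, 0.5, 0.2, 1.0)  # no diversification benefit
partial = portfolio_std(0.5, 0.2, 0.5, 0.2, 0.3)    # risk falls below 20%
```

Only when the correlation is below 1 does combining assets lower the portfolio's standard deviation, which is why mixing lines of business paid off in the study.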
APA, Harvard, Vancouver, ISO, and other styles
49

Faria, Priscila Neves. "Avaliação de métodos para determinação do número ótimo de clusters em estudo de divergência genética entre acessos de pimenta." Universidade Federal de Viçosa, 2009. http://locus.ufv.br/handle/123456789/4018.

Full text
Abstract:
Made available in DSpace on 2015-03-26T13:32:05Z (GMT). No. of bitstreams: 1 texto completo.pdf: 688077 bytes, checksum: 369ec0145d58b4c3f2d93ab69403df95 (MD5) Previous issue date: 2009-01-19
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior
Many times, the interpretation of the results in cluster analysis is done subjectively, that is, through the inspection of dendrograms, since there are no objective criteria to identify the formed clusters. In face of this problem, the present study aimed to: (1) find an objective way to obtain the cut-point (optimal number of clusters) in a dendrogram in order to support decision making; (2) work with indices such as the Root Mean Square Standard Deviation (RMSSTD) and R-Squared (RS), explaining the contribution of each of them in determining the optimal number of clusters; (3) apply the method, aiming to identify divergent accessions to be used in breeding programs. An alternative solution to this problem is to use the RMSSTD and RS indices, which are calculated from the variables among and within the formed clusters, providing an objective way to determine the optimal number. Some morphological characteristics of forty-nine accessions of the species Capsicum chinense Jacq. from the Germplasm Bank of Vegetables of the Federal University of Viçosa (Banco de Germoplasma de Hortaliças da Universidade Federal de Viçosa, Minas Gerais, Brazil) were analyzed by means of cluster analysis. The accessions were clustered based on the proposed techniques and an optimal number of clusters was obtained. The 49 accessions analyzed were classified into only seven clusters, according to the plots of RMSSTD and RS versus the number of clusters.
Muitas vezes, a interpretação dos resultados em análise de agrupamentos é feita de forma subjetiva, isto é, através da inspeção de dendrogramas. Isto se deve ao fato de haver dificuldade em se encontrar na literatura um critério objetivo de fácil aplicação para identificar o número ideal de grupos formados. Diante deste problema, o presente trabalho teve por objetivos: 1) Avaliar a aplicabilidade de critério objetivo de se obter o ponto de corte (número ótimo de clusters) num dendrograma para a tomada de decisão; 2) trabalhar os conceitos de índices como RMSSTD (root mean square standard deviation) e RS (R-Squared), discutindo a contribuição de cada um destes na obtenção do número ótimo de clusters em acessos de Capsicum chinense; 3) aplicação do método, visando a identificar acessos divergentes de Capsicum chinense para serem utilizados em programas de melhoramento. Os índices RMSSTD e RS são calculados de acordo com as variáveis entre e dentro dos grupos formados, caracterizando uma forma objetiva para determinar o número ótimo. Para se obter o ponto de máxima curvatura da trajetória dos índices RMSSTD e RS em função do aumento do número de grupos (X), utilizou-se o Método da Máxima Curvatura Modificado. Foram analisadas, por meio da análise de agrupamentos, algumas características morfológicas de quarenta e nove acessos da espécie Capsicum chinense Jacq. do Banco de Germoplasma de Hortaliças da Universidade Federal de Viçosa. A partir das técnicas propostas agrupou-se os acessos, obtendo um número ótimo de grupos. Os resultados classificam os 49 acessos avaliados em apenas sete grupos de acordo com o gráfico do RMSSTD versus o número de grupos e o gráfico do RS versus o número de grupos.
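The RS (R-Squared) index from objective (2) measures the share of total variation explained by a grouping, rising toward 1 as clusters become more homogeneous; the elbow of RS versus the number of clusters marks the cut-point. A one-dimensional sketch (the data values are illustrative):

```python
def rs_index(data, clusters):
    """R-Squared clustering index: (SS_total - SS_within) / SS_total.
    `clusters` lists the indices of `data` belonging to each group."""
    def ss(points):
        m = sum(points) / len(points)
        return sum((p - m) ** 2 for p in points)
    ss_total = ss(data)
    ss_within = sum(ss([data[i] for i in group]) for group in clusters)
    return (ss_total - ss_within) / ss_total

values = [1.0, 2.0, 9.0, 10.0]
rs_two = rs_index(values, [[0, 1], [2, 3]])  # two tight clusters explain most variation
rs_one = rs_index(values, [[0, 1, 2, 3]])    # a single cluster explains nothing
```

Computing the index for 1, 2, 3, ... clusters and locating the point of maximum curvature of the resulting trajectory mirrors the procedure that yielded seven groups in the study.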
APA, Harvard, Vancouver, ISO, and other styles
50

Razuvaieva, A. D. "Research of the degree of compatibility of the international standards for quality management systems (ISO 9001:2015) and anti-corruption management systems (ISO 37001:2016)." Master's thesis, Sumy State University, 2020. https://essuir.sumdu.edu.ua/handle/123456789/82187.

Full text
Abstract:
Razuvaieva, A. D. Research of the degree of compatibility of the international standards for quality management systems (ISO 9001:2015) and anti-corruption management systems (ISO 37001:2016) [Text]: thesis for the master's degree: 152 – metrology and information-measuring technology / Antonina Dmytrivna Razuvaieva; supervisors O. V. Ivchenko and Jasiulewicz-Kaczmarek Małgorzata. – Sumy: Sumy State University, 2020 – 78 p.
The master's thesis comprises 78 pages, including 23 figures, eight tables, a bibliography of 28 sources on three pages, and one appendix on four pages. The aim is to develop recommendations for implementing integrated management systems in accordance with the requirements of the international standards ISO 9001:2015 and ISO 37001:2016 by studying the degree of compatibility of the requirements of these management-system standards, based on an improved methodology for calculating their compatibility. The object of research is the process of assessing the degree of compatibility of the requirements of the international standards ISO 9001:2015 and ISO 37001:2016. The subject of research is the regulatory support for implementing the requirements of the international standard ISO 37001:2016 during the development and maintenance of anti-corruption management systems. Scientific novelty of the results: the approach to studying the degree of compatibility of the requirements of regulatory documents was further developed, based on an algorithm for assessing the abnormality of measurement results when the standard deviation and population mean are a priori unknown, applied to a seven-point expert-evaluation scale. This makes it possible to optimize the implementation of the requirements of ISO 37001:2016 during the development and maintenance of anti-corruption management systems. Practical significance of the results: based on the calculated degree of compatibility of the requirements of ISO 9001:2015 and ISO 37001:2016, a list of documented information for implementing an anti-corruption management system in accordance with the requirements of ISO 37001:2016 is proposed.
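The abnormality check the abstract refers to, applied when both the mean and the standard deviation are a priori unknown, is in the spirit of a studentized-deviation (Grubbs-type) test: both parameters are estimated from the sample itself and each observation's scaled deviation is compared with a critical value. A minimal illustrative sketch for seven-point expert ratings follows; the function name and the default threshold are assumptions for illustration, since the thesis would take the critical value from statistical tables for the chosen significance level.

```python
import statistics

def flag_abnormal(ratings, tau_crit=2.0):
    """Flag ratings whose studentized deviation |x - mean| / s exceeds
    tau_crit, with the mean and sample standard deviation estimated
    from the data itself (both a priori unknown)."""
    mean = statistics.fmean(ratings)
    s = statistics.stdev(ratings)
    return [abs(x - mean) / s > tau_crit for x in ratings]

# Example: six consistent expert scores and one suspiciously low one
# on a seven-point scale; only the outlying score is flagged.
flags = flag_abnormal([4, 5, 4, 5, 4, 5, 1])
```

Ratings flagged this way would be reviewed or excluded before the compatibility scores of the two standards' clauses are aggregated.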
APA, Harvard, Vancouver, ISO, and other styles
