Dissertations / Theses on the topic 'Measured value'

Consult the top 50 dissertations / theses for your research on the topic 'Measured value.'

1

Haug, Even. "Methods for Extreme Value Statistics Based on Measured Time Series." Thesis, Norwegian University of Science and Technology, Department of Mathematical Sciences, 2008. http://urn.kb.se/resolve?urn=urn:nbn:no:ntnu:diva-9733.

Full text
Abstract:

The thesis describes the Average Exceedance Rate (AER) method, a method for predicting return levels from sampled time series. The AER method is an alternative to the Peaks Over Threshold (POT) method, which is based on the assumption that data exceeding a certain threshold behave asymptotically. The AER method avoids this assumption by using sub-asymptotic data instead. Also, instead of using declustering to obtain independent data, correlation among the data is dealt with by assuming a Markov-like property. A practical procedure for using the AER method is proposed and tested on two sets of real data. These are a set of wind speed data from Norway and a set of wave height data from the Norwegian continental shelf. The method appears to give satisfactory results for the wind speed data, but for the wave height data its use appears to be invalid. However, the method itself seems to be robust and to have certain advantages when compared to the POT method.
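As a rough, self-contained illustration of the exceedance-rate idea (not the thesis's actual AER procedure), the Python sketch below computes an empirical average exceedance rate from a synthetic series, fits an assumed log-linear tail, and extrapolates a 100-year return level; the data, tail form, and all numbers are assumptions.

```python
# Rough illustration only: empirical average exceedance rate from synthetic "hourly
# wind speeds", an assumed log-linear (exponential) tail fit, and extrapolation to a
# 100-year return level. The thesis's AER method uses a more flexible sub-asymptotic
# tail function and handles serial dependence explicitly; none of that is shown here.
import numpy as np

rng = np.random.default_rng(0)
years = 10
x = rng.gumbel(loc=20.0, scale=3.0, size=years * 365 * 24)      # synthetic hourly series

levels = np.linspace(np.quantile(x, 0.95), np.quantile(x, 0.9995), 40)
aer = np.array([(x > eta).sum() / years for eta in levels])     # exceedances per year

# assumed tail: log(exceedance rate) roughly linear in the level
slope, intercept = np.polyfit(levels, np.log(aer), deg=1)

# 100-year return level: the level whose fitted exceedance rate is 1/100 per year
return_level_100 = (np.log(1.0 / 100.0) - intercept) / slope
print(f"estimated 100-year return level: {return_level_100:.1f}")
```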

2

McFarland, Kathryne L. "Can Principals Identify Value-Adding Teachers? Can Principals Accurately Identify Effective Teachers as Measured by Value-Added Analysis?" Ohio University / OhioLINK, 2013. http://rave.ohiolink.edu/etdc/view?acc_num=ohiou1377251390.

Full text
3

Gogotya, Ntombizodwa Wonkie. "Productivity in South Africa as measured by changes in value added per employee per year." Thesis, Stellenbosch : Stellenbosch University, 2004. http://hdl.handle.net/10019.1/50069.

Full text
Abstract:
Thesis (MBA)--Stellenbosch University, 2004.
ENGLISH ABSTRACT: One of the objectives of corporate reporting is the communication of information on a company's performance to all stakeholders. The traditional financial statements (balance sheet, income statement and the cash flow statement) do not sufficiently meet all of the above requirements. In view of this, the study project acknowledges the need for corporate reporting beyond the conventional financial reports. This therefore necessitated the use of a Value-Added Statement (VAS) as one of the financial statements that is regarded as being able to enhance corporate reporting. A VAS is based on an economic concept and, therefore, the contribution of a specific company towards the Gross Domestic Product (GDP) can be directly measured. Although a VAS does not solely disclose all of the information pertaining to the economic performance of a business enterprise, it is believed that the statement can assist interested parties in making well-informed economic decisions. However, the publication of a VAS is still not a statutory regulation in South Africa. The findings indicate some limitations in the manner in which a VAS is published. The format is not statutory and is not audited, but there are opportunities for further research and improvement. This aspect has unfortunately led some users to mistrust the statement. For example, it almost always indicates that the labour component takes most of the value added (Hird, 1983). Statistical tests (e.g. Shapiro-Wilk's W, Spearman's R test, histograms) have been conducted. These tests show a weak negative relationship between change in number of employees and change in value added by each employee. This suggests that value added per employee is not the only factor that contributes to productivity. There is therefore not enough evidence to conclude that companies that reduce the number of employees improve productivity.
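For readers unfamiliar with the layout of a Value-Added Statement, a toy calculation (all figures invented, not taken from the study) shows how value added is derived from sales less bought-in goods and services, distributed among stakeholders, and expressed per employee.

```python
# Hypothetical figures, purely illustrative of how a value-added statement is laid out.
sales = 1_000_000
bought_in = 600_000                  # goods and services purchased from outside the firm
value_added = sales - bought_in      # the firm's own contribution, akin to its GDP share

distribution = {
    "employees (salaries and wages)": 250_000,
    "providers of capital (interest and dividends)": 60_000,
    "government (taxes)": 50_000,
    "retained for reinvestment (depreciation and reserves)": 40_000,
}
assert sum(distribution.values()) == value_added   # the statement must balance

employees = 20
print(f"value added: {value_added}, value added per employee: {value_added / employees:,.0f}")
```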
4

Hilbert, Anja, L. Schäfer, C. Hübner, T. Carus, B. Herbig, F. Seyfried, S. Kaiser, and A. Dietrich. "Pre- and postbariatric subtypes and their predictive value for health-related outcomes measured three years after surgery." Universität Leipzig, 2019. https://ul.qucosa.de/id/qucosa%3A38020.

Full text
Abstract:
Background: Although bariatric surgery is the most effective treatment for severe obesity, a subgroup of patients shows insufficient postbariatric outcomes. Differences may at least in part result from heterogeneous patient profiles regarding reactive and regulative temperament, emotion dysregulation, and disinhibited eating. This study aims to subtype patients based on these aspects before and two years after bariatric surgery and tests the predictive value of identified subtypes for health-related outcomes three years after surgery.
5

Wong, Peter Kim-Hung. "QUANTIFYING THE PERCEIVED VALUE OF PHARMACY SERVICES AS MEASURED BY THE CONTINGENT VALUATION METHOD: FOCUS ON COMMUNITY PHARMACY." University of Cincinnati / OhioLINK, 2000. http://rave.ohiolink.edu/etdc/view?acc_num=ucin980272432.

Full text
6

Stewart, Robert L. (Robert Lee) 1960. "The Relationships Between the TeacherInsight Score and Student Performance As Measured by Student TAKS Academic Change Scores." Thesis, University of North Texas, 2014. https://digital.library.unt.edu/ark:/67531/metadc700102/.

Full text
Abstract:
The purpose of this study was to investigate the relationship between TeacherInsight™ (TI) scores and student performance as measured by student academic change scores on the Texas Assessment of Knowledge and Skills (TAKS) test. School district administrators, particularly district personnel administrators, are continually faced with the task of screening and hiring potential teacher applicants who are expected to influence student achievement outcomes directly. Efforts to make the screening, selection, and hiring process more efficient and effective have led to the use of certain teacher prescreening selection instruments that provide a research-based assessment of teachers’ affective attributes, which purportedly predicts teacher effectiveness. This study addressed this concern using a teacher screening and selection tool, the TI, designed by the Gallup Organization. According to the Gallup Organization, the TI is a predictor of teacher affective attributes or talents. The state of Texas uses a student evaluation process called the TAKS to measure student academic gains in certain subject areas. This study examined the relationship between the TI and teacher effectiveness as measured by student academic TAKS change scores in mathematics in fourth and fifth grade. I used data obtained from a single school district in north central Texas. The specific targeted population consisted of 874 students enrolled in mathematics and 44 fourth- and fifth-grade teachers hired over a 3-year period (2008-2011). I applied a quantitative causal-comparative research design. Descriptive statistics for all variables were presented and bivariate relationships between continuous variables were examined. A two-level linear regression model was used to predict student performance on state-mandated assessments using teachers’ TI scores while controlling for relevant covariates. The statistical significance level throughout the study was set at α = .05. A major finding of this study revealed that teachers’ TI scores were not significant predictors of student achievement in the final model (p = .351). Moreover, the final model did not have significant predictive power when compared to the null model. The findings suggest that other factors not recorded in this dataset may influence student academic performance. Only student gender was a significant predictor of TAKS scores. However, the effect size indicated that student gender accounted for less than 1% of the variance in student achievement (R2 = .003). The findings of this study indicate that the TI should not be used as the sole instrument in predicting the quality and potential influence a teacher candidate will have on student performance on state-mandated assessments, and the selected school district should consider re-evaluating its use of the screening instrument for selecting teachers. Recommendations based on the results of the study are discussed and areas for future research are provided.
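As a sketch of the kind of two-level model the abstract describes (students nested within teachers, the teacher's TI score as a level-two predictor, student gender as a covariate), the following uses statsmodels' mixed-effects API on synthetic data; every variable name, value, and effect size is an assumption rather than the study's data.

```python
# Illustrative two-level (mixed-effects) regression on synthetic data: students nested
# within teachers, outcome regressed on the teacher-level TI score and a student
# covariate. The true TI effect is set to zero here, mirroring the kind of null result
# the study reports; all names and numbers are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_teachers, students_per = 44, 20
teacher_id = np.repeat(np.arange(n_teachers), students_per)
ti_score = np.repeat(rng.normal(70, 10, n_teachers), students_per)
gender = rng.integers(0, 2, n_teachers * students_per)
teacher_effect = np.repeat(rng.normal(0, 2, n_teachers), students_per)
taks_change = (1.0 + 0.0 * ti_score + 0.5 * gender + teacher_effect
               + rng.normal(0, 5, n_teachers * students_per))

df = pd.DataFrame(dict(taks_change=taks_change, ti_score=ti_score,
                       gender=gender, teacher_id=teacher_id))
result = smf.mixedlm("taks_change ~ ti_score + gender", df, groups=df["teacher_id"]).fit()
print(result.summary())   # inspect whether ti_score adds predictive power
```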
7

Phillips, Roger. "The predictive value of in vitro chemosensitivity tests of anticancer drugs : in vitro chemosensitivity of a panel of murine colon tumours determined by a colony forming assay at drug exposure parameters measured in vivo." Thesis, University of Bradford, 1988. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.329305.

Full text
8

Erasmus, Petrus Daniel. "Evaluating value based financial performance measures." Thesis, Stellenbosch : University of Stellenbosch, 2008. http://hdl.handle.net/10019.1/1407.

Full text
Abstract:
Thesis (PhD (Economics))--University of Stellenbosch, 2008.
The primary financial objective of a firm is the maximisation of its shareholders’ value. A problem faced by the shareholders of a firm is that it is difficult to determine the effect of management decisions on the future share returns of the firm. Furthermore, it may be necessary to implement certain monitoring costs to ensure that management is focused on achieving this objective. A firm would, therefore, benefit from being able to identify those financial performance measures that are able to link the financial performance of the firm to its share returns. Implementing such a financial performance measure in the valuation and reward systems of a firm should ensure that management is aligned with the objective of shareholder value maximisation, and rewarded for achieving it. A large number of traditional financial performance measures have been developed. These measures are often criticised for excluding a firm’s cost of capital, and are considered inappropriate to be used when evaluating value creation. Furthermore, it is argued that these measures are based on accounting information, which could be distorted by Generally Accepted Accounting Practice (GAAP). Studies investigating the relationship between these measures and share returns also provide conflicting results. As a result of the perceived limitations of traditional measures, value based financial performance measures were developed. The major difference between the traditional and value based measures is that the value based measures include a firm’s cost of capital in their calculation. They also attempt to remove some of the accounting distortions resulting from GAAP. Proponents of the value based measures present these measures as a major improvement over the traditional financial performance measures and report high levels of correlation between the measures and share returns. A number of studies containing contradictory results have been published. On the basis of these conflicting results it is not clear whether the value based measures are able to outperform the traditional financial performance measures in explaining share returns. The primary objectives of this study are thus to:

• Determine the relationship between the traditional measures earnings before extraordinary items (EBEI) and cash from operations (CFO), and shareholder value creation;
• Investigate the value based measures residual income (RI), economic value added (EVA), cash value added (CVA) and cash flow return on investments (CFROI), and to determine their relationship with the creation of shareholder value;
• Evaluate the incremental information content of the value based measures above the traditional measures.

The information content of the traditional measures and the value based measures are evaluated by employing an approach developed by Biddle, Bowen and Wallace (1997). The first phase of this approach entails the evaluation of the relative information content of the various measures in order to determine which measure explains the largest portion of a firm’s market-adjusted share returns. The second phase consists of an evaluation of the incremental information content of the components of a measure in order to determine whether the inclusion of an additional component contributes statistically significant additional information beyond that contained in the other components. The study is conducted for South African industrial firms listed on the Johannesburg Securities Exchange for the period 1991 to 2005.
The data required to calculate the measures investigated in the study are obtained from the McGregor BFA database. This database contains annual standardised financial statements for listed and delisted South African firms. It also contains EVA, cost of capital and invested capital amounts for those firms listed at the end of the research period. Including only these listed firms in the research sample would expose the study to a survivorship bias. Hence these values are estimated for those firms that delisted during the period under review by employing a similar approach to the one used in the database. The resulting sample consists of 364 firms providing 3181 complete observations. Since different information is required to calculate the various measures included in the study, different samples are compiled from this initial sample and included in the tests conducted to evaluate the information content of the measures. The results of this study indicate that the value based measures are not able to outperform EBEI in the majority of the relative information content tests. Furthermore, the measures EVA, CVA and CFROI are also not able to outperform the relatively simple value based measure RI. The results from the incremental information content tests indicate that although some of the components of the value based measures provide statistically significant incremental information content, the level of significance for these relatively complex adjustments is generally low. Based on these results, the claims made by the proponents of the value based measures cannot be supported. Furthermore, if a firm intends to incorporate its cost of capital in its financial performance measures, the measure RI provides most of the benefits contained in the other more complex value based measures.
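A bare-bones sketch of what a relative information content comparison can look like in code (synthetic data, simplified far beyond the study's design): regress market-adjusted returns separately on each measure and compare the R-squared of the competing, non-nested models.

```python
# Hedged sketch in the spirit of Biddle, Bowen and Wallace (1997)-style relative
# information content tests: separate regressions of market-adjusted returns on each
# performance measure, compared by R^2. Data are synthetic; nothing here reproduces
# the study's sample or variable construction.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 3181                                              # same order as the study's observations
ebei = rng.normal(size=n)                             # stand-in for EBEI
ri = 0.8 * ebei + rng.normal(scale=0.6, size=n)       # stand-in for residual income
returns = 0.4 * ebei + rng.normal(scale=1.0, size=n)  # market-adjusted share returns

r_squared = {name: sm.OLS(returns, sm.add_constant(x)).fit().rsquared
             for name, x in {"EBEI": ebei, "RI": ri}.items()}
print(r_squared)   # the higher R^2 indicates greater relative information content
```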
9

Schulle, Polly Jane. "Spaces of operators containing c₀ and/or ℓ∞ with an application of vector measures." Thesis, University of North Texas, 2008. https://digital.library.unt.edu/ark:/67531/metadc9036/.

Full text
Abstract:
The Banach spaces L(X, Y), K(X, Y), Lw*(X*, Y), and Kw*(X*, Y) are studied to determine when they contain the classical Banach spaces c₀ or ℓ∞. The complementation of the Banach space K(X, Y) in L(X, Y) is discussed as well as what impact this complementation has on the embedding of c₀ or ℓ∞ in K(X, Y) or L(X, Y). Results concerning the complementation of the Banach space Kw*(X*, Y) in Lw*(X*, Y) are also explored and how that complementation affects the embedding of c₀ or ℓ∞ in Kw*(X*, Y) or Lw*(X*, Y). The ℓp spaces for 1 ≤ p < ∞ are studied to determine when the space of compact operators from one ℓp space to another contains c₀. The paper contains a new result which classifies these spaces of operators. Results of Kalton, Feder, and Emmanuele concerning the complementation of K(X, Y) in L(X, Y) are generalized. A new result using vector measures is given to provide more efficient proofs of theorems by Kalton, Feder, Emmanuele, Emmanuele and John, and Bator and Lewis as well as a new proof of the fact that ℓ∞ is prime.
10

Ganief, Moegamad Shahiem. "Development of value at risk measures : towards an extreme value approach." Thesis, Stellenbosch : Stellenbosch University, 2001. http://hdl.handle.net/10019.1/52189.

Full text
Abstract:
Thesis (MBA)--Stellenbosch University, 2001.
ENGLISH ABSTRACT: Commercial banks, investment banks, insurance companies, non-financial firms, and pension funds hold portfolios of assets that may include stocks, bonds, currencies, and derivatives. Each institution needs to quantify the amount of risk its portfolio is exposed to in the course of a day, week, month, or year. Extreme events in financial markets, such as the stock market crash of October 1987, are central issues in finance and particularly in risk management and financial regulation. A method called value at risk (VaR) can be used to estimate market risk. Value at risk is a powerful measure of risk that is gaining wide acceptance amongst institutions for the management of market risk. Value at Risk is an estimate of the largest loss that a portfolio is likely to suffer during all but truly exceptional periods. More precisely, the VaR is the loss that an institution can be confident will be exceeded only a certain fraction of the time over a particular period. The power of the concept is its generality. VaR measures are applicable to entire portfolios - encompassing many asset categories and multiple sources of risk. As with its power, the challenge of calculating VaR also stems from its generality. In order to measure risk in a portfolio using VaR, some means must be found for determining a return distribution for the portfolio. There exists a wide range of literature on different methods of implementing VaR. But, when one attempts to apply the results, several questions remain open. For example, given a VaR measure, how can the risk manager test that the particular measure at hand is appropriately specified? And secondly, given two different VaR measures, how can the risk manager pick the best measure? Despite the popularity of VaR for measuring market risk, no consensus has yet been reached as to the best method to implement this risk measure. The absence of consensus is in part derived from the realization that each method currently in use has some significant drawbacks. The aim of this project is threefold: to introduce the reader to the concept of VaR; to present the theoretical basis for the general approaches to VaR computations; and to introduce and apply Extreme Value Theory to VaR calculations. The general approaches to VaR computation fall into three categories, namely the Analytic (Parametric) Approach, the Historical Simulation Approach, and the Monte Carlo Simulation Approach. Each of these approaches has its strengths and weaknesses, which will be studied more closely. The extreme value approach to VaR calculation is a relatively new approach. Since most observed returns are central ones, traditional VaR methods tend to ignore extreme events and focus on risk measures that accommodate the whole empirical distribution of central returns. The danger of this approach is that these models are prone to fail just when they are needed most - in large market moves, when institutions can suffer very large losses. The extreme value approach is a tool that attempts to provide the user with the best possible estimate of the tail area of the distribution. Even in the absence of useful historical data, extreme value theory provides guidance on the kind of distribution that should be selected so that extreme risks are handled conservatively. As an illustration, the extreme value method will be applied to a foreign exchange futures contract. The validity of applying EVT to VaR calculations will be tested by examining the data of the Rand/Dollar One Year Futures Contracts.
An extended worked example will be provided which attempts to highlight the considerable strengths of the method as well as its pitfalls and limitations. These results will be compared to VaR measures calculated using a GARCH(1,1) model.
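To make two of the general approaches named above concrete, here is a minimal sketch of historical simulation and the analytic (parametric normal) approach on synthetic daily returns; the data, confidence level, and horizon are assumptions, and the thesis's EVT and GARCH(1,1) treatments are not reproduced.

```python
# Minimal sketch of two general VaR approaches on synthetic daily returns:
# historical simulation (empirical quantile) and the analytic/parametric normal
# approach. All figures are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
returns = rng.standard_t(df=4, size=2500) * 0.01          # heavy-tailed daily returns

alpha = 0.99
var_hist = -np.quantile(returns, 1 - alpha)                           # historical simulation VaR
var_norm = -(returns.mean() + returns.std() * norm.ppf(1 - alpha))    # analytic (normal) VaR
print(f"99% 1-day VaR: historical {var_hist:.4f}, parametric normal {var_norm:.4f}")
```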
11

Gootzeit, Joshua Holubec. "ACT process measures : specificity and incremental value." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/1325.

Full text
Abstract:
A number of objective personality questionnaires have been published which aim to measure the six processes related to Acceptance and Commitment Therapy's model of treatment (acceptance, defusion, present moment awareness, self-as-context, values, and committed action). These measures operationally define these hypothesized processes in research settings. However, little research has been done to investigate whether these processes, as measured by these questionnaires, are differentiable from each other or from other, seemingly similar constructs such as distress tolerance and coping styles. Additionally, it is unclear whether these questionnaire measures have differing relationships with other potentially relevant constructs, such as psychopathology, functioning, and personality. The structure of these process measures was investigated across two participant samples. A multi-trait structure of ACT processes was found, with three higher order dimensions consisting of psychological inflexibility/cognitive fusion, mindfulness, and avoidance, as well as a number of distinguishable lower order traits. This structure was found across multiple samples, and measures of these factor analytically-derived traits were found to have incremental validity and to be distinguishable from other, superficially similar psychological processes. These results provide guidance for measurement selection and suggest future directions for scale development. Relevance to treatment outcome research is also discussed.
12

Kelly, Annela Rämmer. "Weakly analytic vector-valued measures /." free to MU campus, to others for purchase, 1996. http://wwwlib.umi.com/cr/mo/fullcit?p9821334.

Full text
13

Nováček, Adam. "Vyrovnání provozních dat v energetických procesech." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2015. http://www.nusl.cz/ntk/nusl-232141.

Full text
Abstract:
This thesis is focused on the problem of data reconciliation of measurements. The objective was to reconcile measured values from an electric drum dryer so that they fit exactly to the mathematical model of drying. Nonlinear data reconciliation with constrained nonlinear optimization was used for the solution. The entire calculation is implemented in MATLAB, and the outputs are graphs of the reconciled measured values on the dryer, such as inlet and outlet temperature and humidity, differential pressure of the exhaust moist air, weight of laundry, atmospheric pressure and electric supply. The achieved solution can be characterized by the amount of evaporated water: the weights of the wet and dry laundry are 27.7 kg and 17.7 kg, the amount of evaporated water calculated from the raw measurements was almost 18.8 kg, and with the reconciled measurements it was 9.7 kg. The goal of the thesis, to obtain more realistic values, was thus achieved.
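The reconciliation idea can be illustrated with a small constrained weighted-least-squares sketch; the thesis works in MATLAB with a full drying model, whereas this Python stand-in uses only a toy water balance (evaporated water equals wet minus dry laundry mass) and invented measurement uncertainties.

```python
# Toy data reconciliation: adjust the measurements as little as possible (weighted by
# assumed uncertainties) so that the balance evaporated = wet - dry holds exactly.
import numpy as np
from scipy.optimize import minimize

measured = np.array([27.7, 17.7, 18.8])   # wet laundry, dry laundry, evaporated water [kg]
sigma = np.array([0.3, 0.3, 3.0])         # assumed standard uncertainties of the sensors

objective = lambda x: np.sum(((x - measured) / sigma) ** 2)        # weighted least squares
balance = {"type": "eq", "fun": lambda x: x[2] - (x[0] - x[1])}    # model constraint

result = minimize(objective, measured, constraints=[balance])
print("reconciled values [kg]:", np.round(result.x, 2))   # evaporated water ends up near 10 kg
```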
14

Yao, Lihua. "Topics in measure-valued processes /." The Ohio State University, 1997. http://rave.ohiolink.edu/etdc/view?acc_num=osu1487947908401034.

Full text
15

Scherman, Vanessa. "The validity of value-added measures in secondary schools." Thesis, Pretoria : [s.n.], 2007. http://upetd.up.ac.za/thesis/available/etd-09192007-140841/.

Full text
16

Forsgren, Johan. "How Low Can You Go? : Quantitative Risk Measures in Commodity Markets." Thesis, Uppsala universitet, Statistiska institutionen, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-314088.

Full text
Abstract:
The volatility model approach to forecasting Value at Risk is complemented with modelling of Expected Shortfalls using an extreme value approach. Using three models from the GARCH family (GARCH, EGARCH and GJR-GARCH) and assuming two conditional distributions, the normal (Gaussian) distribution and Student's t distribution, to make predictions of VaR, the forecasts are used as a threshold for assigning losses to the distribution tail. The Expected Shortfalls are estimated assuming that the violations of VaR follow the Generalized Pareto distribution, and the estimates are evaluated. The results indicate that the most efficient model for making predictions of VaR is the asymmetric GJR-GARCH, and that assuming the t distribution generates conservative forecasts. In conclusion there is evidence that the commodities are characterized by asymmetry and conditional normality. Since no comparison is made, the EVT approach cannot be deemed to be either superior or inferior to standard approaches to Expected Shortfall modeling, although the data intensity of the method suggests that a standard approach may be preferable.
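As a toy version of the extreme-value step described (not the thesis's GARCH-filtered pipeline), the sketch below fits a Generalized Pareto distribution to losses above a threshold and computes the corresponding Expected Shortfall; the data, threshold choice, and seed are assumptions.

```python
# Peaks-over-threshold toy: fit a Generalized Pareto distribution (GPD) to losses above
# a VaR-style threshold and compute Expected Shortfall at that level as the threshold
# plus the GPD mean excess. Synthetic data; all choices are assumptions.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(4)
losses = rng.standard_t(df=4, size=5000) * 0.01        # synthetic daily losses

threshold = np.quantile(losses, 0.99)                  # treat the 99% quantile as VaR
excesses = losses[losses > threshold] - threshold
xi, _, beta = genpareto.fit(excesses, floc=0)          # shape xi, scale beta, location fixed at 0

expected_shortfall = threshold + beta / (1.0 - xi)     # GPD mean excess is beta/(1-xi) for xi < 1
print(f"VaR(99%) ~ {threshold:.4f}, GPD-based ES(99%) ~ {expected_shortfall:.4f}")
```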
17

Perry, Thomas. "The validity, interpretation and use of school value-added measures." Thesis, University of Birmingham, 2016. http://etheses.bham.ac.uk//id/eprint/6773/.

Full text
Abstract:
This thesis examines the validity of school value-added measures and the validity of arguments for their interpretation and use. The opening chapters review the development of school value added measures, existing evidence on their properties and validity and their current use in research, policy and practice. The empirical results are based on four studies using English National Pupil Database data and a large, nationally-representative dataset of teacher-assessed attainment data for English pupils aged from 7 to 13. The findings all relate to the properties of school value-added measures and the seriousness of a number of threats to their validity. The four empirical studies examine the following issues: observable bias and error, inter-method reliability when compared to estimates from a quasi-experimental regression discontinuity design, stability of school value added scores and of specific cohorts over time, and consistency of school value-added scores within cohorts and between different school cohorts at a single point in time. The closing chapters discuss the validity of value-added measures in general and in relation to the areas of use identified. Individually and collectively, the results advance understanding of numerous threats to validity and have substantial implications for the use of value-added measures in research, policy and practice.
18

Smith, Michael Bennet 1979. "Disparate measures: Poetry, form, and value in early modern England." Thesis, University of Oregon, 2010. http://hdl.handle.net/1794/11182.

Full text
Abstract:
In early modern England the word "measure" had a number of different but related meanings, with clear connections between physical measurements and the measurement of the self (ethics), of poetry (prosody), of literary form (genre), and of capital (economics). In this dissertation I analyze forms of measure in early modern literary texts and argue that measure-making and measure-breaking are always fraught with anxiety because they entail ideological consequences for emerging national, ethical, and economic realities. Chapter I is an analysis of the fourth circle of Dante's Inferno. In this hell Dante portrays a nightmare of mis-measurement in which failure to value wealth properly not only threatens to infect one's ethical well-being but also contaminates language, poetry, and eventually the universe itself. These anxieties, I argue, are associated with a massive shift in conceptions of measurement in Europe in the late medieval period. Chapter II is an analysis of the lyric poems of Thomas Wyatt, who regularly describes his psychological position as "out of measure," by which he means intemperate or subject to excessive feeling. I investigate this self-indictment in terms of the long-standing critical contention that Wyatt's prosody is "out of measure," and I argue that formal and psychological expressions of measure are ultimately inseparable. In Chapter III I argue that in Book II of the Faerie Queene Edmund Spenser figures ethical progress as a course between vicious extremes, and anxieties about measure are thus expressed formally as a struggle between generic forms, in which measured control of the self and measured poetic composition are finally the same challenge. Finally, in my reading of Troilus and Cressida I argue that Shakespeare portrays persons as commodities who are constantly aware of their own values and anxious about their "price." Measurement in this play thus constitutes a system of valuation in which persons attempt to manipulate their own value through mechanisms of comparison and through praise or dispraise, and the failure to measure properly evinces the same anxieties endemic to Dante's fourth circle, where it threatens to infect the whole world.
Committee in charge: George Rowe, Chairperson, English; Benjamin Saunders, Member, English; Lisa Freinkel, Member, English; Leah Middlebrook, Outside Member, Comparative Literature
19

Etheridge, Alison Mary. "Asymptotic behaviour of some measure-valued diffusions." Thesis, University of Oxford, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.329943.

Full text
20

Klacar, Dorde. "Estimating Expected Exposure for the Credit Value Adjustment risk measure." Thesis, Umeå universitet, Nationalekonomi, 2013. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-73104.

Full text
21

Taliaferro, Thomas. "Accounting for Value : Using Social Return on Investment (SROI) to measure the value created by CSR initiatives." Thesis, Stockholms universitet, Stockholm Resilience Centre, 2012. http://urn.kb.se/resolve?urn=urn:nbn:se:su:diva-78445.

Full text
Abstract:
The role of the corporation is shifting from an entity focused on making monetary profits to an organization focused on creating value for all of its stakeholders. Despite this, many of the guidelines, standards and reporting frameworks that have been developed to take into account the increasing stakeholder expectations only capture corporate inputs and outputs relating to social initiatives. By not understanding the value created by social initiatives, information is missed that could be useful to the organization and its stakeholders. The purpose of this study has therefore been to see if the Social Return on Investment (SROI) methodology can be a viable tool for companies to use for measuring the value created by CSR activities. This has been accomplished via a case study of a CSR initiative funded by a multinational wind power company in India, and more specifically the building and use of a traditional water harvesting structure called a taanka. Having gone through the six steps of SROI, including monetization of all non-market social, environmental and economic values, the results show that for every Indian Rupee (INR) invested into the studied CSR initiative, 29 INR of social value have been created for the stakeholders. The results also show the relation between different inputs and outcomes for the stakeholders affected by the initiative. By analyzing the results several lessons for the construction of future taankas can be learnt. Each taanka should for instance be constructed for as many households as possible and ownership should be shared by the users. More resources should also be allocated to following up the outcomes created by CSR initiatives to help to maximize the efficiency of the resources used to create social value. The methodology can also be used to understand the shared corporate and societal values created by measuring the value created for both the company and the stakeholders, which in turn is useful when deciding on the allocation of corporate resources.
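A toy SROI calculation (figures entirely invented, deliberately not the study's 29:1 result) shows the mechanics: monetized yearly outcomes are discounted to present value and divided by the investment.

```python
# Hypothetical SROI arithmetic: discount the monetized yearly social outcomes of an
# initiative to present value and divide by the investment. All figures are invented.
investment = 50_000          # e.g. INR invested in building a taanka (assumed)
yearly_value = 200_000       # monetized social value created per year (assumed)
years, discount_rate = 10, 0.08

present_value = sum(yearly_value / (1 + discount_rate) ** t for t in range(1, years + 1))
sroi_ratio = present_value / investment
print(f"SROI ratio of roughly {sroi_ratio:.1f} : 1")
```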
22

Eriksson, Kristofer. "Risk Measures and Dependence Modeling in Financial Risk Management." Thesis, Umeå universitet, Institutionen för fysik, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-85185.

Full text
Abstract:
In financial risk management it is essential to be able to model dependence in markets and portfolios in an accurate and efficient way. A high positive dependence between assets in a portfolio can be devastating, especially in times of crises, since losses will most likely occur at the same time in all assets for such a portfolio. The dependence is therefore directly linked to the risk of the portfolio. The risk can be estimated by several different risk measures, for example Value-at-Risk and Expected shortfall. This paper studies some different ways to measure risk and model dependence, both in a theoretical and empirical way. The main focus is on copulas, which are a way to model and construct complex dependencies. Copulas are a useful tool since they allow the user to separately specify the marginal distributions and then link them together with the copula. However, copulas can be quite complex to understand and it is not trivial to know which copula to use. An implemented copula model might give the user a "black-box" feeling and a severe model risk if the user trusts the model too much and is unaware of what is going on. Another model would be to use linear correlation, which is also a way to measure dependence. This is an easier model and as such it is believed to be easier for all users to understand. However, linear correlation is only easy to understand in the case of elliptical distributions, and when we move away from this assumption (which is usually the case in financial data), some clear drawbacks and pitfalls become present. A third model, called historical simulation, uses the historical returns of the portfolio and estimates the risk on this data without making any parametric assumptions about the dependence. The dependence is assumed to be incorporated in the historical evolvement of the portfolio. This model is very easy and very popular, but it is more limited than the previous two models by the assumption that history will repeat itself and needs many more historical observations to yield good results. Here we face the risk that the market dynamics have changed when looking too far back in history. In this paper some different copula models are implemented and compared to the historical simulation approach by estimating risk with Value-at-Risk and Expected shortfall. The parameters of the copulas are also investigated under calm and stressed market periods. This information about the parameters is useful when performing stress tests. The empirical study indicates that it is difficult to distinguish the parameters between the stressed and calm market periods. The overall conclusion is: which model to use depends on our beliefs about the future distribution. If we believe that the distribution is elliptical then a correlation model is good, if it is believed to have a complex dependence then the user should turn to a copula model, and if we can assume that history will repeat itself then historical simulation is advantageous.
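As a minimal illustration of the copula idea discussed (a Gaussian copula with heavy-tailed margins, which is not necessarily one of the thesis's fitted models), the sketch below simulates two dependent return series and estimates a portfolio Value-at-Risk; every parameter is an assumption.

```python
# Toy copula construction: Gaussian dependence, Student-t margins, and a simulated
# 99% portfolio VaR. Parameters, seed, and portfolio weights are assumptions.
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(5)
rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=100_000)
u = norm.cdf(z)                          # uniform marginals carrying the Gaussian dependence
asset_returns = t.ppf(u, df=4) * 0.01    # heavy-tailed t margins for both assets

portfolio = asset_returns.mean(axis=1)   # equally weighted two-asset portfolio
var_99 = -np.quantile(portfolio, 0.01)
print(f"simulated 99% portfolio VaR: {var_99:.4f}")
```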
23

Hålén, Anna, and Carin Gerok. "Performance Measurement Systems: How to measure improvements in a value stream." Thesis, KTH, Industriell produktion, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-141042.

Full text
Abstract:
Increasing efficiency in organisations and processes in recent years has made the competition among companies tougher. To increase efficiency, organisations have to analyse their processes, and by doing so, gaps and bottlenecks can be found. Value stream mapping is a common method used when organisations are analysing their processes. Furthermore, in order for organisations to remain competitive, performance measurement systems can be used. The balanced scorecard is a performance measurement system model commonly used for this purpose. In addition to an extensive literature review, a case study at AstraZeneca has been conducted to gain valuable in-depth knowledge. AstraZeneca was chosen since it must continuously improve its organisation and internal processes in order to maintain its position as one of the world’s leading biopharmaceutical companies. The internal artwork process has been examined and served as a base for answering the research question. The purpose of the master thesis is to deepen the knowledge about how performance measurement systems can be used to increase the efficiency within organisations. Based on this purpose, the main research question to be investigated is: How is a performance measurement system used to improve the value stream of artwork in a pharmaceutical company? The master thesis resulted in several important conclusions. To start with, in order to develop and improve the value stream of the artwork process, a third-generation balanced scorecard is to be used. Gaps and bottlenecks within the artwork process occurred partly due to a lack of communication between employees. The third-generation balanced scorecard stresses the use of two-way communication to visualise the organisation’s strategy, resulting in a more efficient process. Another important conclusion is that a performance measurement system is not a one-time effort but should be used continuously. The empirical contribution from this master thesis is that, in order to ensure the two-way communication, a person responsible for the entire process at an operational level is needed. This person, the owner of the process/the sponsor, will make sure that the process’ vision and goals align with the organisation’s overall strategy as well as facilitate and ensure a two-way communication. Moreover, the master thesis resulted in recommendations regarding how AstraZeneca should work to improve their artwork process. Due to confidential information these recommendations are shown to AstraZeneca only and are not included in the published master thesis. Keywords: Performance management, performance measurement system, balanced scorecard, artwork, value stream, gap-analysis, bottleneck
24

Koort, Eve. "Uncertainty estimation of potentiometrically measured pH and pKa values." Online version, 2006. http://dspace.utlib.ee/dspace/bitstream/10062/599/5/koorteve.pdf.

Full text
25

SILVA, PAULA TAVARES DA. "MEASURES OF ECONOMIC PERFORMANCE AND VALUE CREATION: THE CASE OF BRAZILIAN COMPANIES." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2010. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=16717@1.

Full text
Abstract:
CONSELHO NACIONAL DE DESENVOLVIMENTO CIENTÍFICO E TECNOLÓGICO
This essay presents an empirical analysis of the performance of a representative sample of Brazilian companies and their sectors over the period 1999-2008, using the performance measure that bears the closest relation to the return of assets in the capital market. In order to achieve this goal, it was necessary to obtain values of traditional performance measures, such as ROI, ROA, ROE and EPS, as well as measures based on the creation of shareholder value, such as EVA (Economic Value Added), and afterwards to correlate them with the external performance measure, MVA (Market Value Added), regarded as the one that best reflects the return of assets, because it considers the company's current performance and takes into account future market expectations. This essay was based on a sample of 47 companies traded on BOVESPA, distributed in 12 sectors of the Brazilian economy. The highest correlation obtained was between EVA and MVA, corroborating Stern Stewart & Co's statement that EVA is a superior measure of business performance. When using EVA as the performance measure, it was noted that only five of the 47 companies evaluated built shareholder value over the decade, that the petroleum and mining sectors were the ones that created the most wealth for the country, and that the period associated with the Lula government was marked by a sharp increase in expectations of future growth of EVAs.
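For readers new to the two measures, a toy calculation with invented figures makes EVA and MVA concrete: EVA charges operating profit for the cost of the capital employed, while MVA compares market value with invested capital.

```python
# Illustrative EVA and MVA arithmetic with invented figures.
nopat = 120.0               # net operating profit after tax
invested_capital = 1000.0
wacc = 0.10                 # weighted average cost of capital
eva = nopat - wacc * invested_capital       # economic value added for the period

market_value = 1400.0       # market value of the firm's debt and equity
mva = market_value - invested_capital       # market value added
print(f"EVA = {eva:.1f}, MVA = {mva:.1f}")  # positive EVA means value was created
```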
26

Hyun, Sunghyup. "Creating and Validating a Measure of Customer Equity in Hospitality Businesses: Linking Shareholder Value With Return on Marketing." Diss., Virginia Tech, 2009. http://hdl.handle.net/10919/28350.

Full text
Abstract:
Understanding the contribution of marketing to the shareholder value of a company has been a major challenge for marketing research. The purpose of this dissertation was to create and validate an attitudinal measure of customer equity in hospitality businesses, thus providing a link between return on marketing and the shareholder value of a company. The theoretical background of the customer equity construct was examined, and then systematic scale development processes were initiated. The results produced two concise scales: (1) 17 items that represent the six dimensions of customer equity in the restaurant industry and (2) 19 items that represent the six dimensions of customer equity in the hotel industry. The six dimensions of customer equity achieved strong convergent validity, discriminant validity, and internal consistency, indicating unidimensionality of the constructs. To further validate the newly developed scale, criterion validity was checked in correlation with six criterion measures using data collected from 590 hospitality industry consumers. The results demonstrate that customer equity closely reflects the shareholder value of a company. Also, it was found that value equity, brand equity, relationship equity, and service quality are significantly and positively correlated with the overall customer equity of a company. In conclusion, customer equity represents the long-term value of a company and reflects the shareholder value of the company, thus providing a link with return on marketing investments. Theoretical and managerial implications are discussed.
Ph. D.
27

Mosmann, Gabriela. "Axiomatic systemic risk measures forecasting." Biblioteca Digital de Teses e Dissertações da UFRGS, 2018. http://hdl.handle.net/10183/178875.

Full text
Abstract:
In this work, we deepen the study of systemic risk measurement via aggregation functions. We consider three different portfolios as proxies for an economic system: two portfolios built from aggregation functions based on all U.S. stocks, and a market index. The risk measures applied are Value at Risk (VaR), Expected Shortfall (ES) and Expectile Value at Risk (EVaR); they are forecast via the classical GARCH model combined with nine probability distribution functions, and also by a nonparametric approach. The forecasts are evaluated by loss functions and violation backtests. Results indicate that our approach can generate an adequate aggregation function to process the risk of a previously selected system.
28

Dicks, Anelda. "Value at risk and expected shortfall : traditional measures and extreme value theory enhancements with a South African market application." Thesis, Stellenbosch : Stellenbosch University, 2013. http://hdl.handle.net/10019.1/85674.

Full text
Abstract:
Thesis (MComm)--Stellenbosch University, 2013.
ENGLISH ABSTRACT: Accurate estimation of Value at Risk (VaR) and Expected Shortfall (ES) is critical in the management of extreme market risks. These risks occur with small probability, but the financial impacts could be large. Traditional models to estimate VaR and ES are investigated. Following usual practice, 99% 10 day VaR and ES measures are calculated. A comprehensive theoretical background is first provided and then the models are applied to the Africa Financials Index from 29/01/1996 to 30/04/2013. The models considered include independent, identically distributed (i.i.d.) models and Generalized Autoregressive Conditional Heteroscedasticity (GARCH) stochastic volatility models. Extreme Value Theory (EVT) models that focus especially on extreme market returns are also investigated. For this, the Peaks Over Threshold (POT) approach to EVT is followed. For the calculation of VaR, various scaling methods from one day to ten days are considered and their performance evaluated. The GARCH models fail to converge during periods of extreme returns. During these periods, EVT forecast results may be used. As a novel approach, this study considers the augmentation of the GARCH models with EVT forecasts. The two-step procedure of pre-filtering with a GARCH model and then applying EVT, as suggested by McNeil (1999), is also investigated. This study identifies some of the practical issues in model fitting. It is shown that no single forecasting model is universally optimal and the choice will depend on the nature of the data. For this data series, the best approach was to augment the GARCH stochastic volatility models with EVT forecasts during periods where the first do not converge. Model performance is judged by the actual number of VaR and ES violations compared to the expected number. The expected number is taken as the number of return observations over the entire sample period, multiplied by 0.01 for 99% VaR and ES calculations.
AFRIKAANSE OPSOMMING: Akkurate beraming van Waarde op Risiko (Value at Risk) en Verwagte Tekort (Expected Shortfall) is krities vir die bestuur van ekstreme mark risiko’s. Hierdie risiko’s kom met klein waarskynlikheid voor, maar die finansiële impakte is potensieel groot. Tradisionele modelle om Waarde op Risiko en Verwagte Tekort te beraam, word ondersoek. In ooreenstemming met die algemene praktyk, word 99% 10 dag maatstawwe bereken. ‘n Omvattende teoretiese agtergrond word eers gegee en daarna word die modelle toegepas op die Africa Financials Index vanaf 29/01/1996 tot 30/04/2013. Die modelle wat oorweeg word sluit onafhanklike, identies verdeelde modelle en Veralgemeende Auto-regressiewe Voorwaardelike Heteroskedastiese (GARCH) stogastiese volatiliteitsmodelle in. Ekstreemwaarde Teorie modelle, wat spesifiek op ekstreme mark opbrengste fokus, word ook ondersoek. In hierdie verband word die Peaks Over Threshold (POT) benadering tot Ekstreemwaarde Teorie gevolg. Vir die berekening van Waarde op Risiko word verskillende skaleringsmetodes van een dag na tien dae oorweeg en die prestasie van elk word ge-evalueer. Die GARCH modelle konvergeer nie gedurende tydperke van ekstreme opbrengste nie. Gedurende hierdie tydperke, kan Ekstreemwaarde Teorie modelle gebruik word. As ‘n nuwe benadering oorweeg hierdie studie die aanvulling van die GARCH modelle met Ekstreemwaarde Teorie vooruitskattings. Die sogenaamde twee-stap prosedure wat voor-af filtrering met ‘n GARCH model behels, gevolg deur die toepassing van Ekstreemwaarde Teorie (soos voorgestel deur McNeil, 1999), word ook ondersoek. Hierdie studie identifiseer sommige van die praktiese probleme in model passing. Daar word gewys dat geen enkele vooruistkattingsmodel universeel optimaal is nie en die keuse van die model hang af van die aard van die data. Die beste benadering vir die data reeks wat in hierdie studie gebruik word, was om die GARCH stogastiese volatiliteitsmodelle met Ekstreemwaarde Teorie vooruitskattings aan te vul waar die voorafgenoemde nie konvergeer nie. Die prestasie van die modelle word beoordeel deur die werklike aantal Waarde op Risiko en Verwagte Tekort oortredings met die verwagte aantal te vergelyk. Die verwagte aantal word geneem as die aantal obrengste waargeneem oor die hele steekproefperiode, vermenigvuldig met 0.01 vir die 99% Waarde op Risiko en Verwagte Tekort berekeninge.
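A minimal sketch (not taken from the thesis) of the Peaks Over Threshold step and of the violation count used to judge the models; SciPy's generalized Pareto fit and simulated Student-t returns, standing in for the Africa Financials Index, are assumptions.

import numpy as np
from scipy.stats import genpareto, t

# Placeholder daily returns with heavy tails, standing in for the index series.
returns = t.rvs(df=4, scale=0.01, size=4000, random_state=42)
losses = -returns

# Peaks Over Threshold: fit a generalized Pareto distribution to exceedances over a
# high empirical threshold.
u = np.quantile(losses, 0.95)
exceedances = losses[losses > u] - u
xi, _, beta = genpareto.fit(exceedances, floc=0)

# POT estimate of the one-day 99% VaR.
n, n_u, p = len(losses), len(exceedances), 0.99
var_99 = u + beta / xi * ((n / n_u * (1 - p)) ** (-xi) - 1)

# Violation backtest: actual exceedances of the VaR versus the expected number n * (1 - p).
violations = int((losses > var_99).sum())
print(f"99% one-day VaR: {var_99:.4f}, violations: {violations}, expected: {n * (1 - p):.0f}")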
APA, Harvard, Vancouver, ISO, and other styles
29

Gill, Hardeep Singh. "Interacting measure-valued diffusions and their long-term behavior." Thesis, University of British Columbia, 2011. http://hdl.handle.net/2429/36923.

Full text
Abstract:
The focus of this dissertation is a class of random processes known as interacting measure-valued stochastic processes. These processes are related to another class of stochastic processes known as superprocesses. Both superprocesses and interacting measure-valued stochastic processes arise naturally from branching particle systems as scaling limits. A branching particle system is a collection of particles that propagate randomly through space, and that upon death give birth to a random number of particles (children). Therefore when the populations of the particle system and branching rate are large one can often use a superprocess to approximate it and carry out calculations that would be very difficult otherwise. There are many branching particle systems which do not satisfy the strong independence assumptions underlying superprocesses and thus are more difficult to study mathematically. This dissertation attempts to address two measure-valued processes with different types of dependencies (interactions) that the associated particles exhibit. In both cases, the method used to carry out this work is called Perkins' historical stochastic calculus, and has never before been used to investigate interacting measure-valued processes of these types. That is, we construct the measure-valued stochastic process associated with an interacting branching particle system directly without taking a scaling limit. The first type of interaction we consider is when all particles share a common chaotic drift from being immersed in the same medium, as well as having other types of individual interactions. The second interaction involves particles that attract to or repel from the center of mass of the entire population. For measure-valued processes with this latter interaction, we study the long-term behavior of the process and show that it displays some types of equilibria.
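Purely as a toy illustration of the branching particle systems mentioned above, and not the dissertation's construction (which uses Perkins' historical stochastic calculus rather than simulation), the sketch below moves particles by Brownian increments and replaces each particle by a Poisson number of children at every generation; all parameter choices are arbitrary.

import numpy as np

rng = np.random.default_rng(1)

# A toy branching particle system: particles move by Brownian increments and, at each
# generation, each particle is replaced by a random number of children at its location.
positions = np.zeros(100)   # initial particles, all started at the origin
dt = 0.01
for step in range(200):
    positions = positions + np.sqrt(dt) * rng.standard_normal(len(positions))
    offspring = rng.poisson(1.0, size=len(positions))   # critical branching on average
    positions = np.repeat(positions, offspring)
    if len(positions) == 0:   # under critical branching the population may die out
        break

# Summary of the (unnormalized) empirical measure after the last generation.
print("particles:", len(positions),
      "mean position:", positions.mean() if len(positions) else "population extinct")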
APA, Harvard, Vancouver, ISO, and other styles
30

Vaillancourt, Jean. "Interacting Fleming-Viot processes and related measure-valued processes." Dissertation (Mathematics), Carleton University, Ottawa, 1987.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
31

Wirch, Julia Lynn. "Coherent Beta Risk Measures for Capital Requirements." Thesis, University of Waterloo, 1999. http://hdl.handle.net/10012/1106.

Full text
Abstract:
This thesis compares insurance premium principles with current financial risk paradigms and uses distorted probabilities, a recent development in premium principle literature, to synthesize the current models for financial risk measures in banking and insurance. This work attempts to broaden the definition of value-at-risk beyond the percentile measures. Examples are used to show how the percentile measure fails to give consistent results, and how it can be manipulated. A new class of consistent risk measures is investigated.
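To make the distorted-probability idea concrete, here is a small illustrative computation that is not part of the thesis: a distortion risk measure applied to a made-up discrete loss distribution, once with the step distortion that reproduces the percentile (VaR-type) measure and once with the proportional hazards transform as a stand-in for the beta distortions studied in the thesis.

import numpy as np

# Made-up discrete loss distribution.
losses = np.array([0.0, 10.0, 50.0, 100.0])
probs = np.array([0.90, 0.06, 0.03, 0.01])

def distortion_risk(losses, probs, g):
    # rho(X) = sum_i x_i * (g(S(x_{i-1})) - g(S(x_i))), with S the survival function.
    order = np.argsort(losses)
    x, p = losses[order], probs[order]
    surv = 1.0 - np.cumsum(p)                      # S(x_i) = P(X > x_i)
    surv_prev = np.concatenate(([1.0], surv[:-1]))
    return float(np.sum(x * (g(surv_prev) - g(surv))))

# Percentile measure = distortion with a step function; it ignores the tail beyond the
# chosen percentile.
var_95 = distortion_risk(losses, probs, lambda u: (u > 0.05).astype(float))

# A concave distortion (proportional hazards transform) loads every tail outcome.
ph_95 = distortion_risk(losses, probs, lambda u: np.sqrt(u))

print("95% percentile measure:", var_95, " PH-distorted risk measure:", ph_95)

The step distortion reproduces the 95th percentile (a loss of 10 here) and is blind to the 50 and 100 outcomes, which is the kind of inconsistency the abstract refers to, while the concave distortion assigns them positive weight.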
APA, Harvard, Vancouver, ISO, and other styles
32

Loggert, Josefin, and Mairon Åhlin. "Subjective perceptions of value : A qualitative case study using informal evaluation to measure the value of an Information System." Thesis, Umeå universitet, Institutionen för informatik, 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:umu:diva-104963.

Full text
Abstract:
How to measure the value of IT is an ongoing debate within IT evaluation research. Research about the value of IT tends to focus on formal aspects and to ignore informal aspects such as the subjective perceptions of individuals, and researchers acknowledge a lack of informal evaluation methods used in practice. This study aims to answer two research questions: How can the informal value of IS be evaluated? And what are the benefits and drawbacks of relying on informal value when assessing IS? To answer these questions, a qualitative case study has been conducted using the DeLone & McLean IS Success Model as a theoretical framework. The results show that using the model was successful; however, we argue that important aspects were missing from the model. The benefit of relying on informal value turned out to be the possibility of discovering aspects of value that are not visible in formal evaluation. The drawbacks of performing an informal evaluation are that it is complicated and time-consuming and does not promise useful results.
APA, Harvard, Vancouver, ISO, and other styles
33

DANIONI, FRANCESCA VITTORIA. ""UNDERSTANDING HUMAN VALUES IS A NEVER-ENDING PROCESS": CHALLENGES IN VALUES MEASUREMENT." Doctoral thesis, Università Cattolica del Sacro Cuore, 2019. http://hdl.handle.net/10280/57794.

Full text
Abstract:
L’obiettivo generale del progetto di ricerca è quello di riflettere sul tema della misurazione dei valori nell’ambito delle scienze psicosociali. Secondo la Teoria di Schwartz, i valori sono definiti come obiettivi desiderabili e transituazionali che servono come principi guida nella vita delle persone per guidare e determinare le azioni e gli atteggiamenti. I valori sono stati prevalentemente indagati tramite l’utilizzo di strumenti self-report per raccogliere dati quantitativi. Tuttavia, le risposte a questi strumenti possono essere influenzate da diversi bias, come ad esempio la desiderabilità sociale, oppure possono dipendere dalla tendenza a riflettere in modo introspettivo delle persone che rispondono. Ciò accade principalmente perché i valori sono per definizione ciò che è desiderabile e sono inoltre concetti astratti. Sulla base di queste riflessioni, i Capitoli 1 e 2 considerano in modo teorico ed empirico gli strumenti self-report di misura dei valori e i bias che possono influenzare le risposte a questi strumenti. I Capitoli 3, 4, 5 e 6 considerano invece un recente sviluppo nel campo della misurazione dei valori, ovvero la possibilità di studiare questo costrutto adottando la prospettiva della cognizione sociale implicita, utilizzando quindi strumenti indiretti per acquisire conoscenza sul tema. Nel presente lavoro sono stati sviluppati due strumenti indiretti utili alla misurazione di valori, il Values Implicit Association Task and the Values Lexical Decision Task; tali strumenti sono stati anche analizzati in funzione della loro relazione con gli strumenti self-report di valori e con un outcome comportamentale.
The general aim of the present research project was to reflect on the measurement of values in the field of psychosocial sciences. According to Schwartz's Theory, values are defined as desirable and trans-situational goals that serve as guiding principles in people's lives to select modes, means, and actions. They have been mainly investigated using self-report instruments to gather quantitative data. However, respondents' answers on these measures may be influenced by different response biases, such as socially desirable responding, or may depend on respondents' tendency to introspection. This is mainly because values are by definition what is desirable, and because they are abstract concepts. Based on this, Chapters 1 and 2 deal theoretically and empirically with the available self-report measures of values and with the possible biases that are likely to influence respondents' answers. Chapters 3 to 6 instead consider a recent trend in the field of values measurement, namely the possibility of studying values from an implicit social cognition perspective, that is, using indirect measures to gain knowledge on the topic. Two indirect measures aimed at measuring values, namely the Values Implicit Association Test and the Values Lexical Decision Task, are developed here and considered in terms of their relations with self-report measures of values and with behavioural outcomes.
APA, Harvard, Vancouver, ISO, and other styles
34

Agarwala, Susama 1978. "Alternative values for sin(2beta) measured from electron/positron collisions at Babar." Thesis, Massachusetts Institute of Technology, 2001. http://hdl.handle.net/1721.1/44512.

Full text
Abstract:
Thesis (S.B.)--Massachusetts Institute of Technology, Dept. of Physics, 2001.
Includes bibliographical references (leaves 38-39).
Babar is measuring the value of sin(2β) in the unitarity triangle of neutral Bd mesons produced in e⁺e⁻ collisions. This thesis explores a model of the Υ(4S) resonance created in these collisions that is composed of two one-state systems instead of one two-state system. Considering only neutral mesons, I write a Monte Carlo simulation to determine an adjusted value for Δm and use this value to fit the data that Babar published. Based on this analysis, I find sin(2β) = 0.75 ± 0.27, about double the value that Babar measures.
by Susama Agarwala.
S.B.
APA, Harvard, Vancouver, ISO, and other styles
35

Yildirim, Irem. "Coherent And Convex Measures Of Risk." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606519/index.pdf.

Full text
Abstract:
One of the financial risks an agent has to deal with is market risk. Market risk is caused by the uncertainty attached to asset values. There exist various measures trying to model market risk. The most widely accepted one is Value-at-Risk. However, Value-at-Risk does not in general encourage portfolio diversification, whereas a consistent risk measure has to do so. In this work, risk measures satisfying these consistency conditions are examined on a theoretical basis. Different types of coherent and convex risk measures are investigated. Moreover, the extension of coherent risk measures to multiperiod settings is discussed.
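A standard numerical illustration, not taken from the thesis, of why Value-at-Risk can discourage diversification: two independent loans each lose 100 with probability 4%, so the 95% VaR of each loan on its own is zero, yet the 95% VaR of the combined book is 100, violating subadditivity.

import numpy as np

rng = np.random.default_rng(2)

# Two independent loans, each losing 100 with probability 4% (otherwise 0).
n = 1_000_000
loss_a = 100 * (rng.random(n) < 0.04)
loss_b = 100 * (rng.random(n) < 0.04)

def var(losses, p=0.95):
    # Percentile-based Value-at-Risk.
    return np.quantile(losses, p)

print("VaR95(A)   =", var(loss_a))            # 0: a single default is rarer than 5%
print("VaR95(B)   =", var(loss_b))            # 0
print("VaR95(A+B) =", var(loss_a + loss_b))   # 100: the diversified book looks riskier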
APA, Harvard, Vancouver, ISO, and other styles
36

Prastorfer, Andreas. "Simulation-Based Portfolio Optimization with Coherent Distortion Risk Measures." Thesis, KTH, Matematisk statistik, 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-266382.

Full text
Abstract:
This master's thesis studies portfolio optimization using linear programming algorithms. The contribution of this thesis is an extension of the convex framework for portfolio optimization with Conditional Value-at-Risk introduced by Rockafellar and Uryasev. The extended framework considers risk measures belonging to the intersection of the classes of coherent risk measures and distortion risk measures, which are known as coherent distortion risk measures. The risk measures from this class that are considered are the Conditional Value-at-Risk, the Wang Transform, the Block Maxima and the Dual Block Maxima measures. The extended portfolio optimization framework is applied to a reference portfolio consisting of stocks, options and a bond index. All assets are from the Swedish market. The returns of the assets in the reference portfolio are modelled with elliptical distributions and with normal copulas with asymmetric marginal return distributions. The portfolio optimization framework is a simulation-based framework that measures the risk using scenarios simulated from the assumed portfolio distribution model. To model the return data with asymmetric distributions, the tails of the marginal distributions are fitted with generalized Pareto distributions, and the dependence structure between the assets is captured using a normal copula. The results obtained from the optimizations are compared across the different distributional return assumptions of the portfolio and the four risk measures. A Markowitz solution to the problem is computed using the mean average deviation as the risk measure. This solution is the benchmark to which the optimal solutions using the coherent distortion risk measures are compared. The coherent distortion risk measures have the tractable property of being able to assign user-defined weights to different parts of the loss distribution and hence to value increasing loss severities as greater risks. The user-defined loss-weighting property and the asymmetric return distribution models are used to find optimal portfolios that account for extreme losses. An important finding of this project is that optimal solutions for asset returns simulated from asymmetric distributions are associated with greater risks, which is a consequence of more accurate modelling of the distribution tails. Furthermore, weighting larger losses with increasingly larger weights shows that the portfolio risk is greater, and a safer position is taken.
Denna masteruppsats behandlar portföljoptimering med linjära programmeringsalgoritmer. Bidraget av uppsatsen är en utvidgning av det konvexa ramverket för portföljoptimering med Conditional Value-at-Risk, som introducerades av Rockafeller och Uryasev. Det utvidgade ramverket behandlar riskmått som tillhör en sammansättning av den koherenta riskmåttklassen och distortions riksmåttklassen. Denna klass benämns som koherenta distortionsriskmått. De riskmått som tillhör denna klass och behandlas i uppsatsen och är Conditional Value-at-Risk, Wang Transformen, Block Maxima och Dual Block Maxima måtten. Det utvidgade portföljoptimeringsramverket appliceras på en referensportfölj bestående av aktier, optioner och ett obligationsindex från den Svenska aktiemarknaden. Tillgångarnas avkastningar, i referens portföljen, modelleras med både elliptiska fördelningar och normal-copula med asymmetriska marginalfördelningar. Portföljoptimeringsramverket är ett simuleringsbaserat ramverk som mäter risk baserat på scenarion simulerade från fördelningsmodellen som antagits för portföljen. För att modellera tillgångarnas avkastningar med asymmetriska fördelningar modelleras marginalfördelningarnas svansar med generaliserade Paretofördelningar och en normal-copula modellerar det ömsesidiga beroendet mellan tillgångarna. Resultatet av portföljoptimeringarna jämförs sinsemellan för de olika portföljernas avkastningsantaganden och de fyra riskmåtten. Problemet löses även med Markowitz optimering där "mean average deviation" används som riskmått. Denna lösning kommer vara den "benchmarklösning" som kommer jämföras mot de optimala lösningarna vilka beräknas i optimeringen med de koherenta distortionsriskmåtten. Den speciella egenskapen hos de koherenta distortionsriskmåtten som gör det möjligt att ange användarspecificerade vikter vid olika delar av förlustfördelningen och kan därför värdera mer extrema förluster som större risker. Den användardefinerade viktningsegenskapen hos riskmåtten studeras i kombination med den asymmetriska fördelningsmodellen för att utforska portföljer som tar extrema förluster i beaktande. En viktig upptäckt är att optimala lösningar till avkastningar som är modellerade med asymmetriska fördelningar är associerade med ökad risk, vilket är en konsekvens av mer exakt modellering av tillgångarnas fördelningssvansar. En annan upptäckt är, om större vikter läggs på högre förluster så ökar portföljrisken och en säkrare portföljstrategi antas.
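A compact sketch of the Rockafellar-Uryasev linear program that underlies the framework described above, written with SciPy's linprog; the simulated return scenarios stand in for the Swedish stock, option and bond data, and the asset count, scenario count and 95% level are illustrative assumptions.

import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)

# Simulated return scenarios for 5 assets over 500 scenarios (placeholder data).
n_assets, n_scen, beta = 5, 500, 0.95
R = rng.normal(0.001, 0.02, size=(n_scen, n_assets))

# Variables are (w, alpha, u_1..u_S); minimize alpha + 1/((1-beta) S) * sum(u_s)
# subject to u_s >= -R_s w - alpha, u_s >= 0, sum(w) = 1, 0 <= w <= 1.
c = np.concatenate([np.zeros(n_assets), [1.0], np.full(n_scen, 1.0 / ((1 - beta) * n_scen))])
A_ub = np.hstack([-R, -np.ones((n_scen, 1)), -np.eye(n_scen)])
b_ub = np.zeros(n_scen)
A_eq = np.concatenate([np.ones(n_assets), [0.0], np.zeros(n_scen)]).reshape(1, -1)
b_eq = np.array([1.0])
bounds = [(0, 1)] * n_assets + [(None, None)] + [(0, None)] * n_scen

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
weights = res.x[:n_assets]
print("optimal weights:", np.round(weights, 3), " minimized CVaR:", round(res.fun, 5))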
APA, Harvard, Vancouver, ISO, and other styles
37

Eksi, Zehra. "Comparative Study Of Risk Measures." Master's thesis, METU, 2005. http://etd.lib.metu.edu.tr/upload/12606501/index.pdf.

Full text
Abstract:
There is little doubt that, over the last decade, risk measurement has become one of the most important topics in finance. Indeed, it is natural to observe such a development, since in the last ten years a huge number of financial transactions ended with severe losses due to severe convulsions in financial markets. Value at Risk, the most widely used risk measure, fails to quantify the risk of a position accurately in many situations. For this reason a number of consistent risk measures have been introduced in the literature. The main aim of this study is to present and compare coherent, convex, conditional convex and some other risk measures in both theoretical and practical settings.
APA, Harvard, Vancouver, ISO, and other styles
38

Jodlowski, Edward. "Value-added measures of student growth: Where we are in Illinois post-PERA." Thesis, Southern Illinois University at Edwardsville, 2016. http://pqdtopen.proquest.com/#viewpdf?dispub=10132960.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Yates, Marinus. "Fundamental momentum as an investment timing indicator for value portfolios." Diss., University of Pretoria, 2012. http://hdl.handle.net/2263/23068.

Full text
Abstract:
The problem associated with value shares is that they may remain undervalued for an extended period of time. Therefore, determining when to buy value shares has been the focus of many investors and academics. Studies have determined that fundamentals provide valuable information when selecting shares, while price momentum provides a decent timing indicator. This research examines a novel share selection approach which seeks to combine fundamentals with momentum to obtain a leading timing indicator. This research seeks to determine if the fundamental momentum indicator can successfully and consistently separate value winners from value losers. The value portfolios were formed using a composite valuation measure made up of three separate indicators. The value portfolio was then ranked based on the strength of the fundamental momentum indicator. This research identified that Leverage Factor and Current Ratio momentum were able to separate value winners from losers in a consistent manner. However, only Current Ratio momentum was capable of creating portfolios which could consistently outperform the market. Therefore, this research identified that fundamental momentum could be used as a timing indicator when acquiring value shares.
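The two-step selection the abstract describes could look roughly like the pandas sketch below. The column names, the equal-weight rank composite and the use of the change in the current ratio as "fundamental momentum" are illustrative assumptions, not the dissertation's exact definitions.

import numpy as np
import pandas as pd

rng = np.random.default_rng(4)

# An invented cross-section of 200 shares with three valuation indicators and one
# fundamental momentum variable.
shares = pd.DataFrame({
    "earnings_yield": rng.normal(0.08, 0.03, 200),
    "book_to_price": rng.normal(0.9, 0.3, 200),
    "dividend_yield": rng.normal(0.03, 0.015, 200),
    "current_ratio_chg": rng.normal(0.0, 0.2, 200),   # stand-in for fundamental momentum
})

# Composite value score: average of the cross-sectional percentile ranks of the three
# valuation indicators (higher = cheaper).
value_ranks = shares[["earnings_yield", "book_to_price", "dividend_yield"]].rank(pct=True)
shares["value_score"] = value_ranks.mean(axis=1)

# Value portfolio = cheapest quartile; winners and losers are then separated by the
# strength of the fundamental momentum indicator.
value_portfolio = shares[shares["value_score"] >= shares["value_score"].quantile(0.75)]
winners = value_portfolio.nlargest(10, "current_ratio_chg")
losers = value_portfolio.nsmallest(10, "current_ratio_chg")
print(winners[["value_score", "current_ratio_chg"]])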
Dissertation (MBA)--University of Pretoria, 2012.
Gordon Institute of Business Science (GIBS)
unrestricted
APA, Harvard, Vancouver, ISO, and other styles
40

Mittler, Markus. "Financial Analysis of Energy-Efficiency Measures in Commercial Real Estate : Quantifying the value adding characteristics of energy-efficiency measures in commercial real estate through asset value increase and yield on investment." Thesis, KTH, Energiteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-189482.

Full text
Abstract:
The purpose of this thesis was to establish and quantify the improvement in real estate appraisal value that results from enhancing the energy performance of commercial real estate in Sweden. The study was carried out in co-operation with Sveafastigheter, a prominent real estate investment company in Sweden. The thesis was conducted by energy auditing two buildings in different temperature regions of Sweden in order to find energy-efficiency measures (EEM) that could be implemented in the buildings. These measures were then assembled into the most cost-efficient energy-performance-improving packages. After the EEM packages were identified, the buildings' economic performance was analysed in order to see how energy-efficiency improvements affect asset valuation and, subsequently, yield on investment. The main objective was to produce an alternative way to evaluate EEMs as part of the audited buildings as a whole, taking into account the value increase of the asset and thereby allowing for a shorter investment recuperation period. Traditionally, the profitability of EEMs is assessed by using the technical lifetime of the equipment as a framework for the investment timeline. The results in this thesis provide an alternative way to value the profitability of EEMs by taking into account the increased value of the building, and thus the increased sales income, which makes an energy-efficiency investment perform inherently better in case the property is sold before the technical lifetime of the improved technology has come to an end.
Meningen med detta examensarbete var att examinera och kvantifiera kommersiella fastigheters förbättrade marknadsvärde som ett direkt resultat av energieffektiviseringsåtgärder i fastigheten för att sedan härleda en ny metod för lönsamhetskalkyler. Examensarbetet har utförts i samarbete med Sveafastigheter som är ett ledande fastighetsinvesteringsbolag i Norden.Examensarbetet utfördes genom att granska två kommersiella fastigheter i två olika temperaturzoner i Sverige för att hitta energieffektiviseringsåtgärder som sammanställts till de mest gynnsamma energieffektiviseringshelheterna. Dessa helheter har sedan förslagits för implementering i de nämnda fastigheterna. Efter detta har fastigheternas ekonomiska prestanda granskats för att se hur energieffektiviseringshelheterna kunde öka fastigheternas värde. Målet med examensarbetet var att slutligen skapa en lönsamhetskalkyl för energieffektiviseringshelheterna som tar fastighetens värdeökning i beaktande.Traditionellt värderas en energieffektiviseringsåtgärd genom att värdera teknisk livslängd och kostnad av apparaturen jämfört med dess årliga besparing. En sådan lönsamhetskalkyl ger en bild av hur lönsam investeringen är utan att ta fastighetens värdeökning i hänsyn. Detta examensarbete ger en insikt i hur reducerade kostnadsnivåer leder till ett högre fastighetsvärde och därmed en något förändrad lönsamhet vid implementering av energieffektiviseringsåtgärder i fastigheter som kan komma att säljas relativt snabbt efter att förbättringarna har utförts.
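The valuation logic can be summarized in a back-of-the-envelope calculation; all figures below are invented and are not taken from the thesis. Capitalizing the annual energy saving at the appraisal yield gives the increase in asset value, which can then be set against the cost of the package instead of relying only on the technical-lifetime payback.

# Illustrative numbers only (not from the thesis).
investment = 500_000.0        # cost of the energy-efficiency package, SEK
annual_saving = 60_000.0      # reduction in annual energy cost, SEK per year
yield_requirement = 0.05      # market yield used in the appraisal

value_increase = annual_saving / yield_requirement    # capitalized saving
simple_payback_years = investment / annual_saving     # traditional lifetime-based view
gain_if_sold_early = value_increase - investment      # value-based view at an early sale

print(f"asset value increase: {value_increase:,.0f} SEK")
print(f"simple payback: {simple_payback_years:.1f} years")
print(f"net gain if the property is sold shortly after the works: {gain_if_sold_early:,.0f} SEK")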
APA, Harvard, Vancouver, ISO, and other styles
41

De Kock, Mienie. "Absolute continuity and on the range of a vector measure." [Kent, Ohio] : Kent State University, 2008. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=kent1216134542.

Full text
Abstract:
Thesis (Ph.D.)--Kent State University, 2008.
Title from PDF t.p. (viewed Jan. 26, 2010). Advisor: Joseph Diestel. Keywords: absolute continuity, range of a vector measure. Includes bibliographical references (p. 40-41).
APA, Harvard, Vancouver, ISO, and other styles
42

Karniychuk, Maryna. "Comparing Approximations for Risk Measures Related to Sums of Correlated Lognormal Random Variables." Master's thesis, Universitätsbibliothek Chemnitz, 2007. http://nbn-resolving.de/urn:nbn:de:swb:ch1-200700024.

Full text
Abstract:
In this thesis the performances of different approximations are compared for a standard actuarial and financial problem: the estimation of quantiles and conditional tail expectations of the final value of a series of discrete cash flows. To calculate risk measures such as quantiles and Conditional Tail Expectations, one needs the distribution function of the final wealth. The final value of a series of discrete payments in the considered model is a sum of dependent lognormal random variables. Unfortunately, its distribution function cannot be determined analytically, so one usually has to resort to time-consuming Monte Carlo simulations. Since computational time remains a serious drawback of Monte Carlo simulation, several analytical techniques for approximating the distribution function of final wealth are proposed in this thesis. These are the widely used moment-matching approximations and innovative comonotonic approximations. Moment-matching methods approximate the unknown distribution function by a given one in such a way that some characteristics (in the present case the first two moments) coincide. The ideas of two well-known approximations are described briefly, and analytical formulas for valuing quantiles and Conditional Tail Expectations are derived for both of them. Recently, a large group of scientists from the Catholic University of Leuven in Belgium has derived comonotonic upper and comonotonic lower bounds for sums of dependent lognormal random variables. These are bounds in terms of "convex order". In order to provide the theoretical background for the comonotonic approximations, several fundamental ordering concepts such as stochastic dominance, stop-loss order and convex order, together with some important relations between them, are introduced. The last two concepts are closely related: both stochastic orders express which of two random variables is the "less dangerous/more attractive" one. The central idea of the comonotonic upper bound approximation is to replace the original sum, representing final wealth, by a new sum whose components have the same marginal distributions as the components of the original sum, but a "more dangerous/less attractive" dependence structure. The upper bound, or mathematically speaking the convex-largest sum, is obtained when the components of the sum are the components of a comonotonic random vector. Therefore, fundamental concepts of comonotonicity theory which are important for the derivation of the convex bounds are introduced, and the most widespread examples of comonotonicity that emerge in a financial context are described. In addition to the upper bound, a lower bound can be derived as well; this provides a measure of the reliability of the upper bound. The lower bound approach is based on the technique of conditioning: it is obtained by applying Jensen's inequality for conditional expectations to the original sum of dependent random variables. Two slightly different versions of the conditioning random variable are considered in this thesis. They give rise to two different approaches, referred to as the comonotonic lower bound and the comonotonic "maximal variance" lower bound approaches. Special attention is given to the class of distortion risk measures. It is shown that the quantile risk measure as well as the Conditional Tail Expectation (under some additional conditions) belong to this class. It is proved that both risk measures under consideration are additive for a sum of comonotonic random variables, i.e. the quantile and the Conditional Tail Expectation of the comonotonic upper and lower bounds can easily be obtained by summing the corresponding risk measures of the marginals involved. A special subclass of distortion risk measures, referred to as the class of concave distortion risk measures, is also considered. It is shown that the quantile risk measure is not a concave distortion risk measure, while the Conditional Tail Expectation (under some additional conditions) is. A theoretical justification is given for the fact that a "concave" Conditional Tail Expectation preserves the convex order relation between random variables; this property does not necessarily hold for the quantile risk measure, as it is not a concave risk measure. Finally, the accuracy and efficiency of the two moment-matching, comonotonic upper bound, comonotonic lower bound and "maximal variance" lower bound approximations are examined for a wide range of parameters by comparison with the results obtained by Monte Carlo simulation. The numerical results show that, generally, in the current situation the lower bound approach outperforms the other methods. Moreover, the preservation of the convex order relation between the convex bounds for the final wealth by the Conditional Tail Expectation is demonstrated by the numerical results, and it is shown numerically that this property does not necessarily hold true for the quantile.
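As a small numerical illustration of the comonotonic upper bound discussed above (all parameters invented): because quantiles are additive for comonotonic sums, the p-quantile of the upper bound is simply the sum of the marginal lognormal p-quantiles, which can be compared against a Monte Carlo estimate for the actual dependent sum.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Sum of dependent lognormals: X_i = exp(Z_i) with jointly normal Z (invented parameters).
mu = np.array([0.0, 0.1, 0.2])
sigma = np.array([0.3, 0.4, 0.5])
rho = 0.5
cov = rho * np.outer(sigma, sigma)
np.fill_diagonal(cov, sigma ** 2)

# Monte Carlo quantile of the true sum (the time-consuming benchmark).
Z = rng.multivariate_normal(mu, cov, size=200_000)
S = np.exp(Z).sum(axis=1)
p = 0.995
q_mc = np.quantile(S, p)

# Comonotonic upper bound: its p-quantile is the sum of the marginal p-quantiles.
q_upper = np.sum(np.exp(mu + sigma * norm.ppf(p)))

print(f"Monte Carlo quantile: {q_mc:.3f}   comonotonic upper bound quantile: {q_upper:.3f}")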
APA, Harvard, Vancouver, ISO, and other styles
43

Pride, Bryce L. "Sensitivity of Value Added School Effect Estimates to Different Model Specifications and Outcome Measures." University of South Florida, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
44

Güerere, Claudia. "Value-Added and Observational Measures Used in the Teacher Evaluation Process: A Validation Study." Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4678.

Full text
Abstract:
Scores from value-added models (VAMs), as used for educational accountability, represent the educational effect teachers have on their students. The use of these scores in teacher evaluations for high-stakes decision making is new for the State of Florida. Validity evidence that supports or questions the use of these scores is critically needed. This research, using data from 2385 teachers from 104 schools in one school district in Florida, examined the validity of the value-added scores by correlating these scores with scores from an observational rubric used in the teacher evaluation process. The VAM scores also were examined in relation to several variables that the literature had identified as correlates of quality teaching as well as variables that were theoretically independent of teacher performance. The observational rubric used in the validation process was based on Marzano's and Danielson's framework and consisted of 34 items and five factors (Ability to Assess Instructional Needs, Plans and Delivers Instruction, Maintains a Student-Centered Learning Environment, Performs Professional Responsibilities, Engages in Continuous Improvement for Self and School). Analyses of the psychometric properties of the observational rubric using confirmatory factor analysis supported the fit of the five-factor structure underlying the rubric. Internal consistency reliabilities for the five observational scales and total score ranged from .81 to .96. The relationships between the observational rubric scores and VAM scores (with and without the standard error of measurement (SE) applied to the VAM score) were generally weak for the overall sample (range of correlations = .05 to .09 for the five observational scales and VAM with SE; .14 to .18 for the five observational scales and VAM without SE). Inspection of the relationship between the VAM and total observational scores within each of the 104 schools revealed that while some schools had a strong relationship, the majority of the schools revealed little to no relationship between the two measures that represent a quality/effective teacher. The last part of this research investigated the relationship of the VAM scores and scores from the observational rubric with variables that had been identified in the literature as correlates of quality teaching. In addition, relationships between variables that the literature had shown to be independent of quality teaching were also examined. Results indicated that VAM scores were not significantly related to any of the predictor variables (e.g., National Board Certification, years of experience, gender, etc.). The observational rubric, on the other hand, had significant relations with National Board Certification, years of experience, and gender. The validity evidence provided in this research calls for caution when using VAM scores in teacher evaluations for high-stakes decision making. The weak relations between the observational scores of teachers' performance and teachers' value-added scores suggest that these measures are representing different dimensions of the multidimensional construct of teaching quality. Ongoing research is needed to better understand the strengths and limitations of both the observational and VAM measures and the reasons why these measures do not often converge. In addition, teacher factors (e.g., grade level) that can account for variation in both the VAM and observational scores need to be identified.
APA, Harvard, Vancouver, ISO, and other styles
45

Pride, Bryce L. "Sensitivity of Value Added School Effect Estimates to Different Model Specifications and Outcome Measures." Scholar Commons, 2012. http://scholarcommons.usf.edu/etd/4391.

Full text
Abstract:
The Adequate Yearly Progress (AYP) Model has been used to make many high-stakes decisions concerning schools, though it does not provide a complete assessment of student academic achievement and school effectiveness. To provide a clearer perspective, many states have implemented various Growth and Value Added Models, in addition to AYP. The purpose of this study was to examine two Value Added Model specifications, the Gain Score Model and the Layered Effects Model, to understand similarities and differences in school effect results. Specifically, this study correlated value added school effect estimates, which were derived from two model specifications and two outcome measures (mathematics and reading test scores). Existing data were obtained from a moderately large and rural school district in Florida. The outcome measures of 7,899 unique students were examined using the Gain Score Model and the Layered Effects Model to estimate school effects. Those school effect estimates were then used to calculate and examine the relationship between school rankings. Overall, the findings in this study indicated that the school effect estimates and school rankings were more sensitive to outcome measures than they were to model specifications. The mathematics and reading correlations from the Gain Score Model for school effects and school rankings were low (indicating high sensitivity), when advancing from Grades 4 to 5, and were moderate in other grades. The mathematics and reading correlations from the Layered Effects Model were low at Grade 5 for school effects and school rankings, as were the correlations at Grade 7 for the school rankings. In the other grades, correlations were moderate to high (indicating lower sensitivity). Correlations between the Gain Score Model and the Layered Effects Model from mathematics were high in each grade for both school effects and school rankings. Reading correlations were also high for each of the grades. These results were similar to the findings of previous school effects research and added to the limited body of literature. Depending upon the outcome measure used, school effects and rankings can vary significantly when using Value Added Models. These models have become a popular component in educational accountability systems, yet there is no one perfect model. If used, these models should be used cautiously, in addition to other accountability approaches.
APA, Harvard, Vancouver, ISO, and other styles
46

Muresan, Elisa Rinastiti. "An examination of bond rating, beta and value-at-risk as financial risk measures." Thesis, Robert Gordon University, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.400651.

Full text
APA, Harvard, Vancouver, ISO, and other styles
47

Wagner, Brooke. "The Predictive Value of Phonemic Awareness Curriculum-Based Measures on Kindergarten Word Reading Fluency." Thesis, University of Oregon, 2016. http://hdl.handle.net/1794/20543.

Full text
Abstract:
This manuscript synthesizes, from empirical research, the importance of the alphabetic principles of reading, the building blocks of teaching reading, indicators of early reading success, and curriculum-based measures (CBM) within the Response to Intervention (RtI) process. A review of the literature reflects contrasting views on which specific pre-reading skill is most predictive of word reading success toward the end of kindergarten and on the important role of CBM in such an analysis. Therefore, my research questions analyzed (a) the correlations between letter naming, letter sounds, phonemic segmentation, and word reading fluency in kindergarten; (b) the relative predictive relation of letter names, letter sounds, and phonemic segmentation measures to word reading fluency for kindergarten students; and (c) the relation of the non-academic variables of special education status, English language learner status, attendance, free-and-reduced-meals status, and NonWhite race to word reading fluency in kindergarten. Correlation results indicated that the correlation between winter word reading fluency and spring word reading fluency in kindergarten was r = .82, spring word reading and fall letter sounds was r = .57, spring word reading and winter letter sounds was r = .66, and spring word reading and spring letter sounds was r = .58. All the non-academic variables correlated weakly with spring word reading, with the exception of fall attendance percentage, which showed a negative to low correlation range (-0.15 to 0.11). In addition, regression results indicated that Winter Word Reading Fluency (Winter WRF) (β = .64) was predictive of Spring Word Reading. Spring Letter Sounds (Spring LS) (β = .29) were also predictive of Spring Word Reading, as were Fall Letter Sounds (Fall LS) (β = .11). These results frame practical implications for reading instruction that suggest ways in which schools and districts can think about staffing, instruction, and schedules to better meet student needs in preparation for state-mandated all-day kindergarten in the fall of 2017 and beyond.
APA, Harvard, Vancouver, ISO, and other styles
48

Heath, A. "Bayesian computations for Value of Information measures using Gaussian processes, INLA and Moment Matching." Thesis, University College London (University of London), 2018. http://discovery.ucl.ac.uk/10050229/.

Full text
Abstract:
Value of Information measures quantify the economic benefit of obtaining additional information about the underlying model parameters of a health economic model. Theoretically, these measures can be used to understand the impact of model uncertainty on health economic decision making. Specifically, the Expected Value of Partial Perfect Information (EVPPI) can be used to determine which model parameters are driving decision uncertainty. This is useful as a tool to perform sensitivity analysis to model assumptions and to determine where future research should be targeted to reduce model uncertainty. Even more importantly, the Value of Information measure known as the Expected Value of Sample Information (EVSI) quantifies the economic value of undertaking a proposed scheme of research. This has clear applications in research prioritisation and trial design, where economically valuable studies should be funded. Despite these useful properties, these two measures have rarely been used in practice due to the large computational burden associated with estimating them in practical scenarios. Therefore, this thesis develops novel methodology to allow these two measures to be calculated in practice. For the EVPPI, the method is based on non-parametric regression using the fast Bayesian computation method INLA (Integrated Nested Laplace Approximations). This novel calculation method is fast, especially for high dimensional problems, greatly reducing the computational time for calculating the EVPPI in many practical settings. For the EVSI, the approximation is based on Moment Matching and using properties of the distribution of the preposterior mean. An extension to this method also uses Bayesian non-linear regression to calculate the EVSI quickly across different trial designs. All these methods have been developed and implemented in R packages to aid implementation by practitioners and allow Value of Information measures to inform both health economic evaluations and trial design.
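The regression formulation of the EVPPI can be sketched in a few lines on a simulated decision model; the cubic polynomial fit below is a crude stand-in for the Gaussian process and INLA regressions the thesis actually develops, and every number is an assumption.

import numpy as np

rng = np.random.default_rng(6)

# Probabilistic sensitivity analysis sample for a two-option decision model (invented).
n = 5000
theta = rng.normal(0.6, 0.1, n)            # parameter of interest, e.g. a treatment effect
other = rng.normal(0.0, 1.0, n)            # remaining uncertain parameters
wtp = 20_000
nb_standard = wtp * 0.55 + 500 * rng.standard_normal(n)
nb_new = wtp * theta - 1_000 + 300 * other

# EVPPI(theta) = E_theta[ max_d E[NB_d | theta] ] - max_d E[NB_d]; the inner conditional
# expectations are estimated by regressing each net benefit on theta (a cubic polynomial
# here, standing in for the GP/INLA regression).
fitted = [np.polyval(np.polyfit(theta, nb, 3), theta) for nb in (nb_standard, nb_new)]
evppi = np.mean(np.max(np.column_stack(fitted), axis=1)) - max(nb_standard.mean(), nb_new.mean())
print(f"estimated EVPPI for theta: {evppi:,.0f} monetary units")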
APA, Harvard, Vancouver, ISO, and other styles
49

Marks, Dean. "Monte Carlo methods for the estimation of value-at-risk and related risk measures." Master's thesis, University of Cape Town, 2011. http://hdl.handle.net/11427/10966.

Full text
Abstract:
Nested Monte Carlo is a computationally expensive exercise. The main contributions we present in this thesis are the formulation of efficient algorithms to perform nested Monte Carlo for the estimation of Value-at-Risk and Expected-Tail-Loss. The algorithms are designed to take advantage of multiprocessing computer architecture by performing computational tasks in parallel. Through numerical experiments we show that our algorithms can improve efficiency in the sense of reducing mean-squared error.
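A minimal sketch of nested Monte Carlo for VaR and Expected-Tail-Loss with the expensive inner revaluations distributed over worker processes; the toy payoff, the scenario model and all parameter values are assumptions rather than the thesis's algorithms.

import numpy as np
from multiprocessing import Pool

def inner_value(scenario, n_inner=2000):
    # Inner-stage Monte Carlo: revalue a call-like payoff conditional on an outer scenario.
    inner_rng = np.random.default_rng(int(scenario * 1e6) % (2 ** 32))
    terminal = scenario * np.exp(0.1 * inner_rng.standard_normal(n_inner))
    return float(np.mean(np.maximum(terminal - 1.0, 0.0)))

if __name__ == "__main__":
    rng = np.random.default_rng(7)

    # Outer stage: risk-factor scenarios at the risk horizon.
    outer = np.exp(0.2 * rng.standard_normal(500))
    base_value = inner_value(1.0)

    # Distribute the inner revaluations across worker processes.
    with Pool() as pool:
        values = np.array(pool.map(inner_value, outer))

    losses = base_value - values
    var_99 = np.quantile(losses, 0.99)
    etl_99 = losses[losses >= var_99].mean()
    print(f"99% VaR: {var_99:.4f}   99% ETL: {etl_99:.4f}")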
APA, Harvard, Vancouver, ISO, and other styles
50

Heimonen, A. (Ari). "On effective irrationality measures for some values of certain hypergeometric functions." Doctoral thesis, University of Oulu, 1997. http://urn.fi/urn:isbn:9514247191.

Full text
Abstract:
The dissertation consists of three articles in which irrationality measures for some values of certain special cases of the Gauss hypergeometric function are considered in both archimedean and non-archimedean metrics. The first presents a general result and a divisibility criterion for certain products of binomial coefficients upon which the sharpenings of the general result in special cases rely. The paper also provides an improvement concerning the values of the logarithmic function. The second paper includes two other special cases, the first of which gives irrationality measures for some values of the arctan function, for example, and the second concerns values of the binomial function. All the results of the first two papers are effective, but no computation of the constants for explicit presentation is carried out. This task is fulfilled in the third article for logarithmic and binomial cases. The results of the latter case are applied to some Diophantine equations.
APA, Harvard, Vancouver, ISO, and other styles