Dissertations / Theses on the topic 'Informational Efficiency'

Consult the top 50 dissertations / theses for your research on the topic 'Informational Efficiency.'

1

Ma, Shiguang. "Tests of informational efficiency of China's stock market /." Title page, contents and abstract only, 2000. http://web4.library.adelaide.edu.au/theses/09PH/09phm111.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Viteva, Svetlana. "The informational efficiency of the European carbon market." Thesis, University of Stirling, 2012. http://hdl.handle.net/1893/11204.

Abstract:
This thesis examines the informational efficiency of the European carbon market based on the European Union Emissions Trading Scheme (EU ETS). The issue is approached from three different perspectives. I explore whether the volatility embedded in carbon options is a rational forecast of subsequently realized volatility. Then, I investigate if, and to what extent, new information about the structural and institutional set-up of the market impacts the carbon price dynamics. Lastly, I examine whether the European carbon market is relevant for the firm valuations of covered companies. First, perhaps because the market is new and derivatives trading on emission allowances has only recently begun, carbon options have not yet been extensively studied. By using data on options traded on the European Climate Exchange, this thesis examines an aspect of market efficiency which has previously been overlooked. Market efficiency suggests that, conditional upon the accuracy of the option pricing model, implied volatility should be an unbiased and efficient forecast of future realized volatility (Campbell et al., 1997). Black (1976) implied volatility and implied volatility estimates directly surveyed from market participants are used in this thesis to study the information content of carbon options. Implied volatility is found to be highly informative and directionally accurate in forecasting future volatility. There is no evidence, however, that the volatility embedded in carbon options is an unbiased and efficient forecast of future realized volatility. Instead, historical volatility-based forecasts are shown to contain incremental information relative to implied volatility, particularly for short-term forecasts. In addition, this thesis finds no evidence that directly surveyed implied volatility estimates perform better as forecasts of future volatility than Black's (1976) estimates.
Second, the market's sensitivity to announcements about the organizational and institutional set-up of the EU ETS is re-examined. Despite their importance for carbon price formation, demand-side announcements and announcements about the post-2012 framework have not yet been researched. By examining a very comprehensive and updated dataset of announcements, this thesis adds to the earlier work of Miclaus et al. (2008), Mansanet-Bataller and Pardo (2009) and Lepone et al. (2011). Market participants are found to rationally incorporate new information about the institutional and regulatory framework of the emissions trading scheme into the carbon price dynamics. However, they seem unable to accurately assess the implications of inter-temporal banking and borrowing for the pricing of futures contracts with different maturities. The impact of macroeconomic conditions on the market's responsiveness is investigated by splitting the dataset into subsamples according to two alternative methods: 1) a simple split into pre-crisis and full-crisis periods, and 2) a Bai-Perron structural break test. Evidence is found that in the context of an economic slowdown and a known oversupply of allowances, the relationship between the carbon price and its fundamentals (institutional announcements, energy prices and extreme weather) breaks down. These findings are consistent with the arguments in Hintermann (2010), Keppler and Mansanet-Bataller (2010) and Koop and Tole (2011) that carbon price drivers change in response to the differing context of the individual trading periods. Third, the role of carbon performance in firm valuation is understudied. Since companies were not obliged to disclose their carbon emissions prior to the launch of the EU ETS, there is little empirical evidence of the effect of carbon performance on market value.
Earlier studies of the European carbon market have focused only on the impact of ETS compliance on the profitability and competitiveness of covered companies (e.g. Anger and Oberndorfer, 2008). There is also little research on how the newly available emissions data have altered the carbon performance of companies. This thesis addresses these gaps in the literature by examining the stock price reactions of British and German firms on the day of verified emissions release under the EU ETS over the period 2006-2011. An event study is conducted using a Seemingly Unrelated Regressions model to deal with the event clustering present in the dataset. Limited evidence is found that investors use information about the carbon performance of companies in their valuations. The information contained in the carbon emissions reports is shown to be somewhat more important for companies with highly carbon-intensive operations. This thesis finds no conclusive evidence that the cap-and-trade programme has provided regulated companies with enough incentives to de-carbonize their operations. The market neither punishes companies which continue to emit carbon at increasing rates nor rewards companies which improve their carbon performance. In brief, the results of the thesis suggest that the market is not yet fully efficient. Inefficiently priced carbon options may allow for arbitrage trades in the market. The inability of investors to incorporate rules on inter-temporal banking and borrowing of allowances across the different trading periods leads to significant price reactions when there should be none. A recessionary economic environment and a known oversupply of emission allowances have led to a disconnect between the carbon price and its fundamental drivers. And, lastly, the signal embedded in the carbon price is not strong enough to invoke investor action and turn carbon performance into a standard component of investment analysis.
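The Black (1976) pricing formula and its inversion for implied volatility, on which the first part of this thesis relies, can be sketched as follows. This is a minimal illustration, not the author's code; the futures price, strike, rate and maturity are invented numbers.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black76_call(F, K, T, r, sigma):
    # Black (1976) price of a European call on a futures contract:
    # exp(-rT) * (F N(d1) - K N(d2)).
    d1 = (log(F / K) + 0.5 * sigma**2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return exp(-r * T) * (F * norm_cdf(d1) - K * norm_cdf(d2))

def implied_vol(price, F, K, T, r, lo=1e-6, hi=5.0, tol=1e-8):
    # Invert the pricing formula by bisection: find sigma such that
    # the model price matches the observed option price.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if black76_call(F, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical allowance futures option: F = 15, K = 16, 3 months, r = 2%.
p = black76_call(15.0, 16.0, 0.25, 0.02, 0.40)
print(round(implied_vol(p, 15.0, 16.0, 0.25, 0.02), 4))  # recovers ~0.40
```

Since the Black (1976) call price is monotone increasing in volatility, the bisection is guaranteed to converge to the unique implied volatility.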
3

Bonnier, Jean-Baptiste. "Essays on commodity prices modelling and informational efficiency." Thesis, Nantes, 2021. http://www.theses.fr/2021NANT3001.

Abstract:
Commodities play an essential role in our economies, and futures markets are central in the determination of commodity prices. This thesis aims to contribute to our understanding of the behaviour of commodity prices and to produce forecasts based on recently developed econometric methods. With regard to forecasting, we focus on two different topics for three commodities (crude oil, wheat, and gold): point forecasts at a one-month horizon from a large set of predictors, and one-day-ahead volatility forecasts using covariates in a recent model selection method for conditional volatility. With regard to explanation, we are interested in informational efficiency and price discovery in two distinct frameworks: predictive regressions using data that pertain to different theories, and an analysis of the effect of changes in traders' positions on volatility.
4

Silva, Paulo Miguel Pereira da. "Essays on the informational efficiency of credit default swaps." Doctoral thesis, Universidade de Évora, 2017. http://hdl.handle.net/10174/21092.

Abstract:
This thesis contributes to the strand of the financial literature on credit derivatives, in particular the credit default swap (CDS) market. We present four inter-connected studies addressing CDS market efficiency, price discovery, informed trading and the systemic nature of the CDS market. The first study explores a specific channel through which informed traders express their views in the CDS market: mergers and acquisitions (M&A) and divestiture activities. We show that information obtained by major banks while providing these investment services is impounded into CDS rates prior to the operation announcement. The run-up to M&A announcements is characterized by greater predictability of stock returns using past CDS spread data. The second study evaluates the incremental information value of CDS open interest relative to CDS spreads using a large panel database of obligors. We find that open interest helps predict CDS rate changes and stock returns. Positive open interest growth precedes the announcement of negative earnings surprises, consistent with the notion that its predictive ability is linked to the disclosure of material information. The third study measures the impact on CDS market quality of the ban on uncovered sovereign CDS buying imposed by the European Union. Using panel data models and a difference-in-differences analysis, we find that the ban helped stabilize CDS market volatility, but was in general detrimental to overall market quality. Lastly, we investigate the determinants of open interest dynamics to uncover the channels through which CDS may endanger the financial system. Although information asymmetry and divergence of opinions on firms’ future performance are found to be relevant drivers of open interest, our results indicate that systematic factors exert a much greater influence. The growth of open interest for different obligors co-varies over time and is pro-cyclical.
Funding costs and counterparty risk also reduce dealers’ willingness to incur inventory risk.
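The difference-in-differences design used in the third study can be sketched in its simplest two-group, two-period form: the change for the treated group minus the change for the control group. This is an illustrative toy, not the thesis's panel specification, and the spread numbers are invented.

```python
def mean(xs):
    return sum(xs) / len(xs)

def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    # Canonical 2x2 difference-in-differences estimate:
    # (treated change) minus (control change), which nets out the
    # common trend shared by both groups.
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))

# Invented illustration: bid-ask spreads (bps) for CDS names covered by a
# ban versus unaffected names, before and after the ban date.
treat_pre  = [10, 12, 11, 9]
treat_post = [16, 18, 17, 15]   # widen by ~6 bps
ctrl_pre   = [10, 11, 12, 9]
ctrl_post  = [12, 13, 14, 11]   # widen by ~2 bps (common trend)
print(did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post))  # → 4.0
```

The estimate attributes only the 4 bps widening in excess of the common trend to the ban, which is the identifying logic of the design.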
5

Clark, Stephen Rhett. "Essays in insider trading, informational efficiency, and asset pricing." Diss., University of Iowa, 2014. https://ir.uiowa.edu/etd/1306.

Abstract:
In this dissertation, I consider a range of topics related to the role played by information in modern asset pricing theory. The primary research focus is twofold. First, I synthesize existing research in insider trading and seek to stimulate an expansion of the literature at the intersection of work in the insider trading and financial economics areas. Second, I present the case for using Peter Bossaerts's (2004) Efficiently Learning Markets (ELM) methodology to empirically test asset pricing models. The first chapter traces the development of domestic and international insider trading regulations and explores the legal issues surrounding the proprietary nature of information in financial markets. I argue that, practically, the reinvigoration of the insider trading debate is unfortunate because, in spite of seemingly unending efforts to settle the debate, we are no closer to answering whether insider trading is even harmful, much less worthy of legal action. In doing so, I challenge the conventional wisdom of framing insider trading research as a quest for resolution to the debate. By adopting an agnostic perspective on the desirability of insider trading regulations, I am able to clearly identify nine issues in this area that are fruitful topics for future research. The second chapter studies prices and returns for movie-specific Arrow-Debreu securities traded on the Iowa Electronic Markets. The payoffs to these securities are based on the movies' initial 4-week U.S. box office receipts. We employ a unique data set for which we have traders' pre-opening forecasts to provide the first direct test of Bossaerts's (2004) ELM hypothesis. We supplement the forecasts with estimated convergence rates to examine whether the prior forecast errors affect market price convergence. Our results support the ELM hypothesis. While significant deviations between initial forecasts and actual box-office outcomes exist, prices nonetheless evolve in accordance with efficient updating. 
Further, convergence rates appear independent of both the average initial forecast error and the level of disagreement in forecasts. Lastly, the third chapter revisits the theoretical justifications for Bossaerts's (2004) ELM, with the goal of providing clear, intuitive proofs of the key results underlying the methodology. The seemingly biggest hurdle to garnering more widespread adoption of the ELM methodology is the confusion that surrounds the use of weighted modified returns when testing for rational asset pricing restrictions. I attack this hurdle by offering a transparent justification for this approach. I then establish how and why Bossaerts's results extend from the case of digital options to the more practically relevant class of all limited-liability securities, including equities. I conclude by showing that the ELM restrictions naturally lend themselves to estimation and testing of asset pricing models, using weighted modified returns, in a Generalized Method of Moments (GMM) framework.
6

Herath, Shanaka, and Gunther Maier. "Informational efficiency of the real estate market: A meta-analysis." Hanyang Economic Research Institute, 2015. http://dx.doi.org/10.17256/jer.2015.20.2.001.

Abstract:
The growing empirical literature testing the informational efficiency of real estate markets uses data from various contexts and at different levels of aggregation. The results of these studies are mixed. We use a distinctive meta-analysis to examine whether some of these study characteristics and contexts lead to a significantly higher chance of identifying an efficient real estate market. The results generated through meta-regression suggest that the use of stock market data and individual-level data, rather than aggregate data, significantly improves the probability of a study concluding efficiency. Additionally, the findings neither provide support for the suspicion that the view of market efficiency has significantly changed over the years nor do they indicate a publication bias resulting from such a view. The statistical insignificance of other study characteristics suggests that the outcome concerning efficiency is for the most part a context-specific random manifestation. (authors' abstract)
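A meta-regression of this kind models the probability that a study concludes efficiency as a function of study characteristics. The sketch below fits a logistic meta-regression by plain gradient ascent on simulated study-level data; the dataset, the 0.7/0.3 rates and the variable meanings are all invented for illustration and are not the paper's data.

```python
import numpy as np

def fit_logit(X, y, lr=0.1, steps=5000):
    # Plain gradient-ascent logistic regression: models
    # P(study concludes "efficient") = sigmoid(intercept + X @ beta).
    X = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    beta = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta += lr * X.T @ (y - p) / len(y)   # average log-likelihood gradient
    return beta

# Invented study-level data: x = 1 if the study used individual-level
# (rather than aggregate) data, y = 1 if it concluded the market is efficient.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 200)
y = (rng.random(200) < np.where(x == 1, 0.7, 0.3)).astype(float)
beta = fit_logit(x.reshape(-1, 1), y)
print(beta[1] > 0)  # positive slope: individual-level data raises the odds of concluding efficiency
```

The fitted slope is roughly the log odds ratio between the two groups of studies, which is the quantity a meta-regression of binary study outcomes reports.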
7

Arjoon, Vaalmikki. "Essays on the informational efficiency of an emerging equity market." Thesis, University of Nottingham, 2012. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.582008.

Abstract:
This doctoral thesis consists of three essays on the nature and evolution of informational efficiency in an emerging market context. Each essay uses firm- and aggregate-level data from an emerging market, namely the Trinidad and Tobago Stock Exchange (TISE). The first essay explores the state of informational efficiency in an emerging market setting and how this efficiency evolves over time. It addresses a key issue in the emerging markets informational efficiency literature: informational efficiency is not a static feature of markets but evolves over time with changes in market features and investor behaviour. The analysis initially applies a battery of econometric tests of the random walk (variance ratio and signed-rank tests) to the full-sample returns of aggregate- and stock-level data from the TISE, to determine whether the market is informationally efficient. The findings show that the market is informationally inefficient at the aggregate level, which is accounted for by the severe lack of efficiency in the bulk of stocks traded on the exchange. This result could imply that, by and large, stock prices in an emerging market setting may not be accurate signals for resource allocation. The next part of the analysis considers whether this state of informational efficiency varies over time, by applying the more robust signed-rank test in a rolling sub-sample framework to the stock- and aggregate-level data. The analysis shows that over time, there are transient periods of informational efficiency, which alternate with periods of inefficiency at the aggregate market level. This pattern of time-varying efficiency is largely mirrored by the banking stocks of the TISE. Such results could mean that informational efficiency in an emerging market setting may improve in some periods but worsen in others. This does not conform to the classical Efficient Markets Hypothesis, which claims that markets should show a clear trend toward higher states of efficiency over time.
The second essay analyses the effects of several market microstructures and financial reforms on time-varying informational efficiency in an emerging market setting. It uses data from the TISE, measured at the firm level for the microstructure variables. This allows the analysis to extract the precise effects of microstructures on efficiency, which may otherwise be hidden by aggregate-level data. The analysis is done within a panel regression framework. We find that improvements in the microstructure variables, including liquidity, volatility, automation and the number of shareholders, enhance informational efficiency over time. However, the financial reforms, including financial liberalisation and regulation, do not alter efficiency in this analysis. We further find that the liquidity and total shareholders of the banking firms have a greater impact on efficiency relative to the other listed firms. Taken together, the results could imply that market microstructures play a more important role in causing informational efficiency in an emerging market setting to evolve over time. The third essay explores price predictability from a lead-lag cross-autocorrelation perspective. In particular, it considers whether the degree of institutional investment across firms induces lead-lag cross-autocorrelation among stocks. The analysis is conducted by applying returns data at the firm level from the TISE in a Vector Autoregressive (VAR) framework. It is found that the institutionally favoured stocks (HI) "lead" the institutionally unfavoured stocks (LO), as the returns of HI predict the returns of LO better than vice versa. Moreover, HI stocks are also found to lead the TISE market portfolio. These patterns of predictability arise because the stock prices of high institutionally owned firms adjust faster to market-wide information, which implies that these prices are more informationally efficient.
It is also found that the extent to which HI leads LO increases with the liquidity of the institutionally favoured stocks and the illiquidity of stocks that are not favoured by institutions. Collectively, the results of this essay provide evidence of the positive role played by institutional investors in the price adjustment process: stocks with a high degree of institutional ownership have a faster speed of adjustment, making them more informationally efficient.
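The variance ratio test used in the first essay compares the variance of q-period returns with q times the variance of one-period returns; under a random walk the ratio should be close to one, while positive return autocorrelation pushes it above one. The sketch below is a bare-bones version of the statistic (without the Lo-MacKinlay standard errors) on simulated data; the series and parameters are invented.

```python
import numpy as np

def variance_ratio(returns, q):
    # Variance ratio statistic: variance of overlapping q-period returns
    # divided by q times the variance of 1-period returns; ~1 under a
    # random walk with uncorrelated increments.
    r = np.asarray(returns) - np.mean(returns)
    var1 = np.mean(r**2)
    rq = np.convolve(r, np.ones(q), mode="valid")  # overlapping q-period sums
    varq = np.mean(rq**2)
    return varq / (q * var1)

rng = np.random.default_rng(1)
iid = rng.normal(0, 0.01, 20000)        # random-walk-like returns
ar = np.empty(20000)                    # positively autocorrelated returns
ar[0] = 0.0
for t in range(1, 20000):
    ar[t] = 0.4 * ar[t - 1] + rng.normal(0, 0.01)
print(round(variance_ratio(iid, 5), 2))  # close to 1
print(round(variance_ratio(ar, 5), 2))   # well above 1
```

In applied work the statistic is reported with heteroskedasticity-robust standard errors and, as in the essay, re-estimated over rolling sub-samples to trace time variation in efficiency.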
8

Braun, Johannes [Verfasser], and Wolfgang [Akademischer Betreuer] Schäfers. "Essays on Informational Efficiency in Real Estate Markets / Johannes Braun ; Betreuer: Wolfgang Schäfers." Regensburg : Universitätsbibliothek Regensburg, 2021. http://d-nb.info/1225935725/34.

9

Zhang, Lei. "Two essays: on the common information in the return volatilities and volumes: on the informational efficiency of municipal bond market." Related electronic resource: Current Research at SU: database of SU dissertations, recent titles available, full text, 2008. http://wwwlib.umi.com/cr/syr/main.

10

Wouassom, Alain. "Momentum and Contrarian trading strategies : implication for risk-sharing and informational efficiency of security markets." Thesis, Queen Mary, University of London, 2017. http://qmro.qmul.ac.uk/xmlui/handle/123456789/24859.

Abstract:
This thesis investigates the profitability of momentum and contrarian strategies in international equity markets. In particular, I introduce for the first time the use of countries' index performance in momentum and contrarian portfolio selection. I show that investors can switch back and forth from one country to another in designing worldwide strategies. The global momentum strategy is consistently profitable between 1969 and 2014. The most successful momentum strategy selects stocks based on their previous performance over 9 months and then holds the portfolio for the next 3 months. This strategy yields 3% per month (42.57% per year). Interestingly, for country-index portfolios formed on the basis of the prior 48 months, prior losers outperform prior winners by 0.83% per month (10.40% per year) during the subsequent 60 months. The reversal effect is substantially stronger for emerging countries, where it yields 1.37% per month (17.70% per year). It remains profitable in the post-globalization period. In addition, I examine for the first time the role of world risk factors in explaining the global momentum and contrarian profits and find that the global momentum strategies obtain significant abnormal returns after adjusting consecutively for world Fama-French risks (0.9% per month or 11.35% per year) and world market-state risks (1.31% per month or 16.76% per year). Of particular interest, I find a strong relation between world macroeconomic risk factors, notably world industrial production, and the momentum return. By contrast, I find no substantial relation between world risk factors and the contrarian profit. These results suggest that excess returns can be earned in the long run by using global investment strategies based on historical prices, challenging the weak form of the Efficient Market Hypothesis. 
In Chapter 1, I explain the momentum and contrarian strategies, motivate the importance of what I propose as global momentum and contrarian strategies, and present the results obtained. In Chapter 2, I review the Efficient Market Hypothesis literature in conformity with standard finance theory. Additionally, I review the behavioural finance literature with a focus on the psychology of investor decision-making, and the stock market under-reaction and overreaction approach to explaining momentum and contrarian profitability. In Chapter 3, I explain in detail the main methodologies used to examine the profitability of the global momentum and contrarian strategies, and motivate the dataset used. In Chapter 4, I examine the new global momentum strategy's profitability internationally. In Chapter 5, I examine the new contrarian strategy's profitability internationally. In Chapter 6, I examine the role of global risk factors in explaining the momentum and contrarian profits. Finally, in Chapter 7, I conclude and highlight the limitations of the thesis.
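The J-month/K-month formation rule behind such strategies (rank assets on past J-month performance, then hold winners and short losers for K months) can be sketched as below. This is a toy illustration on simulated index levels, not the thesis's backtest; the drift, noise level and dimensions are invented.

```python
import numpy as np

def momentum_winners_losers(prices, t, J=9, top=3):
    # Rank assets (here: country indices) on their past J-period return
    # up to time t; return the index positions of winners and losers.
    past = prices[t] / prices[t - J] - 1.0
    order = np.argsort(past)          # ascending past return
    return order[-top:], order[:top]  # winners, losers

rng = np.random.default_rng(2)
n_countries, T = 10, 60
# Invented index levels: the first three countries get a positive drift.
drift = np.zeros(n_countries)
drift[:3] = 0.03
rets = rng.normal(0, 0.01, (T, n_countries)) + drift
prices = np.cumprod(1 + rets, axis=0)

winners, losers = momentum_winners_losers(prices, t=40, J=9)
print(sorted(int(i) for i in winners))  # the drifting countries dominate the winner leg
```

A full backtest would roll this selection forward each month and track the winner-minus-loser return over the K-month holding period.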
11

Renault, Thomas. "Three essays on the informational efficiency of financial markets through the use of Big Data Analytics." Thesis, Paris 1, 2017. http://www.theses.fr/2017PA01E009/document.

Abstract:
The massive increase in the availability of data generated every day by individuals on the Internet has made it possible to address the predictability of financial markets from a different perspective. Without making the claim of offering a definitive answer to a debate that has persisted for forty years between partisans of the efficient market hypothesis and behavioral finance academics, this dissertation aims to improve our understanding of the price formation process in financial markets through the use of Big Data analytics. More precisely, it analyzes: (1) how to measure intraday investor sentiment and determine the relation between investor sentiment and aggregate market returns, (2) how to measure investor attention to news in real time, and identify the relation between investor attention and the price dynamics of large-capitalization stocks, and (3) how to detect suspicious behaviors that could undermine the informational role of financial markets, and determine the relation between the level of posting activity on social media and small-capitalization stock returns. The first essay proposes a methodology to construct a novel indicator of investor sentiment by analyzing an extensive dataset of user-generated content published on the social media platform StockTwits. Examining users' self-reported trading characteristics, the essay provides empirical evidence of sentiment-driven noise trading at the intraday level, consistent with behavioral finance theories. The second essay proposes a methodology to measure investor attention to news in real time by combining data from traditional newswires with the content published by experts on the social media platform Twitter. The essay demonstrates that news that garners high attention leads to large and persistent changes in trading activity, volatility, and price jumps. It also demonstrates that the pre-announcement effect is reduced when corrected newswire timestamps are considered. 
The third essay provides new insights into the empirical literature on small-capitalization stock market manipulation by examining a novel dataset of messages published on the social media platform Twitter. The essay proposes a novel methodology to identify suspicious behaviors by analyzing interactions between users, and provides empirical evidence of suspicious stock recommendations on social media that could be related to market manipulation. The conclusions of the essay should reinforce regulators' efforts to better control social media and highlight the need for a better education of individual investors.
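One common way to turn classified social media messages into a sentiment indicator, in the spirit of the bullishness index of Antweiler and Frank (2004), is a log ratio of bullish to bearish message counts per interval. The sketch below is an illustration of that construction, not the dissertation's indicator, and the message counts are invented.

```python
from math import log

def bullishness(n_bullish, n_bearish):
    # Bullishness index from classified message counts in one interval:
    # log((1 + bullish) / (1 + bearish)). The +1 terms keep the index
    # defined when one class has no messages; the index is antisymmetric.
    return log((1 + n_bullish) / (1 + n_bearish))

# Invented intraday message counts on a StockTwits-like feed.
print(round(bullishness(120, 40), 3))   # optimistic interval: positive
print(round(bullishness(40, 120), 3))   # pessimistic interval: negative, mirror image
```

A time series of this index per 5- or 30-minute interval can then be related to contemporaneous or subsequent market returns, as the first essay does with its own sentiment measure.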
APA, Harvard, Vancouver, ISO, and other styles
12

Hassan, Mahamood Mahomed. "Testing the pricing and informational efficiency of the S&P 500 stock index futures market." Diss., The University of Arizona, 1989. http://hdl.handle.net/10150/184858.

Full text
Abstract:
Three empirical studies are conducted examining the efficiency of S&P 500 futures prices and the pricing of these futures contracts. In the first study, the ability of futures prices to predict the realized spot S&P 500 index prices on the expiration date is examined for near-term contracts. The futures prices are found to be unbiased predictors of the realized spot index prices for the nineteen quarterly contracts from 1982 to 1986. Previous studies report significant deviations in S&P 500 futures prices from theoretically determined Cost of Carry Model (CCM) prices. In the second study, it is found that the CCM using the federal funds rate, a proxy for the overnight repurchase rate, provides relatively better estimates of the S&P 500 futures prices over the 1984-1986 period. The futures mispricing also reflects the weekend effect anomaly: futures prices are "over-priced" relative to CCM prices on Mondays, whereas the opposite occurs on Fridays. The futures over-pricing (under-pricing) is characterized by "bull" ("bear") financial markets, and the extent of price changes is relatively greater in the futures market. The futures under-pricing is supported by strong futures market volume and open-interest positions. The basis, and changes in it over the futures contract period, are measures of how well integrated the futures market and the underlying spot market are. In the third study, based on daily closing prices for the S&P 500 index and index futures for the 1984-1986 period, it is found that the basis decreases over the contract period but the rate of decrease is independent of the time to expiration. The change in basis on Mondays is generally positive, which also reflects the weekend effect anomaly. The daily basis is negative on 107 days, which generally occurs during strong futures market trading volume and open interest positions.
It is doubtful whether the negative basis can be attributed to a negative net financing cost, where the dividend yield on the spot index exceeds the cost of financing the spot position forward.
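The cost-of-carry relation tested in the study can be sketched in a few lines. The numbers below are hypothetical (not data from the dissertation) and serve only to show how a dividend yield above the financing rate, i.e. a negative net carry, produces a fair futures price below spot and hence a negative basis:

```python
import math

def cost_of_carry_futures(spot, r, d, T):
    """Fair futures price under the cost-of-carry model:
    F = S * exp((r - d) * T), with r the financing rate and
    d the dividend yield, both continuously compounded."""
    return spot * math.exp((r - d) * T)

# Hypothetical parameters: dividend yield (8%) exceeds the
# financing rate (6%), so net carry is negative and the fair
# futures price falls below the spot index level.
F = cost_of_carry_futures(spot=250.0, r=0.06, d=0.08, T=0.25)
basis = F - 250.0   # negative basis under negative net carry
```

When `d < r` the same formula yields a futures price above spot, which is the usual case for equity index futures.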
APA, Harvard, Vancouver, ISO, and other styles
13

Lo, Chun Yin. "The Effect of Institutional Shareholding on the Informational Efficiency of Stock Prices: Evidence from the Hang Seng Index." Scholarship @ Claremont, 2015. http://scholarship.claremont.edu/cmc_theses/1029.

Full text
Abstract:
This paper uses survey data from the Hong Kong Stock Exchange (HKEx) from 1991-2013 to test the role that institutional ownership plays in the relative informational efficiency of stock prices in the Hang Seng Index, using the R² of stock prices as a measure of efficiency. This paper finds that at the aggregate level, the presence of institutional ownership is positively associated with R², reflecting a negative effect on the level of information incorporated into stock prices. However, when foreign institutions are isolated, the relationship with R² reverses, and I find a positive correlation with the informational efficiency of stock prices. Moreover, this paper finds that a period characterized by high growth in institutional shareholding does not necessarily correspond to a greater improvement in the informational environment of stock markets. The results, however, lack significance, perhaps owing to the shortcomings of the survey data, which are limited to 21 annual observations once a t-1 year lag is incorporated. With more observations we would expect a substantial increase in the significance of the coefficients on our explanatory variables.
APA, Harvard, Vancouver, ISO, and other styles
14

Samkange, Edgar. "An investigation of the informational efficiency of the Johannesburg Stock Exchange with respect to monetary policy (2000-2009)." Thesis, University of Fort Hare, 2010. http://hdl.handle.net/10353/324.

Full text
Abstract:
This study investigates the informational efficiency of the Johannesburg Stock Exchange with respect to monetary policy. Multivariate co-integration, Granger causality, a vector error correction model, impulse response function analysis and variance decomposition analysis are employed to test for semi-strong form efficiency in the South African equity market. The variables of interest are monthly data from 2000-2009 on the Johannesburg Stock Exchange index, money supply (M1 and M2), the short-term interest rate, inflation, the rand/dollar exchange rate, the London Stock Exchange index (FTSE100) and GDP. Weak form efficiency is examined using unit root tests. The results of this study show evidence of weak form efficiency of the JSE using the Augmented Dickey-Fuller and Phillips-Perron unit root tests. The results reject the hypothesis that the JSE is semi-strong form efficient and have important implications for government policy, regulatory authorities and participants in the South African stock market.
APA, Harvard, Vancouver, ISO, and other styles
15

Shang, Danjue. "Option Markets and Stock Return Predictability." Diss., The University of Arizona, 2016. http://hdl.handle.net/10150/613277.

Full text
Abstract:
I investigate the information content of the implied volatility spread, which is the spread in implied volatilities between a pair of call and put options with the same strike price and time-to-maturity. By constructing the implied volatility time series for each stock, I show that stocks with larger implied volatility spreads tend to have higher future returns during 2003-2013. I also find that even volatilities implied from untraded options contain such information about future stock performance. The trading strategy based on the information contained in actively traded options does not necessarily outperform its counterpart derived from untraded options. This is inconsistent with previous research suggesting that the information contained in the implied volatility spread largely results from the price pressure induced by informed trading in option markets. Further analysis suggests that option illiquidity is associated with the implied volatility spread, and that the magnitude of this spread contains information about the risk-neutral distribution of the underlying stock return. A larger spread is associated with smaller risk-neutral variance, more negative risk-neutral skewness, and seemingly larger risk-neutral kurtosis, and this association is primarily driven by the systematic components in risk-neutral higher moments. I design a calibration study which reveals that the non-normality of the underlying risk-neutral return distribution relative to Brownian motion can give rise to the implied volatility spread through the channel of the early exercise premium.
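As a minimal sketch of the quantity studied here, the implied volatility spread of a matched call/put pair can be backed out numerically from option prices. The Black-Scholes pricer and bisection below are textbook-standard; all prices and parameters are hypothetical, not data from the thesis:

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_price(S, K, T, r, sigma, kind="call"):
    """Black-Scholes European option price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    if kind == "call":
        return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)
    return K * math.exp(-r * T) * norm_cdf(-d2) - S * norm_cdf(-d1)

def implied_vol(price, S, K, T, r, kind="call", lo=1e-6, hi=5.0, tol=1e-8):
    """Bisection: the Black-Scholes price is monotone increasing in sigma."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_price(S, K, T, r, mid, kind) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical quotes for a call/put pair with the same strike and expiry
S, K, T, r = 100.0, 100.0, 0.25, 0.01
call_iv = implied_vol(6.0, S, K, T, r, "call")
put_iv = implied_vol(5.2, S, K, T, r, "put")
iv_spread = call_iv - put_iv  # a positive spread is the bullish signal studied
```

Note the thesis works with American options, where the early exercise premium matters; the European pricer above only illustrates the mechanics of extracting the spread.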
APA, Harvard, Vancouver, ISO, and other styles
16

Frisell, Lars. "Information and politics." Doctoral thesis, Stockholm : Economic Research Institute, Stockholm School of Economics (Ekonomiska forskningsinstitutet vid Handelshögsk.) (EFI), 2001. http://www.hhs.se/efi/summary/577.htm.

Full text
APA, Harvard, Vancouver, ISO, and other styles
17

Fernández, Bariviera Aurelio. "Ensayos sobre la Eficiencia Informativa del Mercado de Capitales." Doctoral thesis, Universitat Rovira i Virgili, 2015. http://hdl.handle.net/10803/308667.

Full text
Abstract:
The Efficient Market Hypothesis (EMH) is one of the pillars of financial economics. We say that a financial market is informationally efficient if prices reflect all available information at a given time. Despite several decades of research on the EMH, there are still issues on which no consensus has been reached in the literature. We therefore approach this study from a new perspective in three respects. First, acknowledging the dynamic nature of informational efficiency, we study it using sliding windows in order to observe its evolution over time. Second, we introduce statistical techniques not commonly used in financial economics. Third, we relate levels of informational efficiency to certain economic variables, in order to examine their interaction. Chapter 1 provides an introduction to the topic and details the structure of the thesis. Chapter 2 establishes the theoretical framework and gives a detailed description of the evolution of the EMH and the empirical tests carried out on it. Chapter 3 consists of four essays that use advanced statistical techniques to study different aspects of the EMH, such as long-term memory, the time-varying nature of informational efficiency and its relation to certain economic variables. Finally, Chapter 4 presents the main conclusions.
APA, Harvard, Vancouver, ISO, and other styles
18

Filbien, Jean-Yves. "Essais sur les fusions et acquisitions." Thesis, Lille 2, 2010. http://www.theses.fr/2010LIL20006.

Full text
Abstract:
After providing a general framework for mergers and acquisitions, in particular as a value-creating process, we examine empirically the effects of their announcement through three essays. First, we study intraday market reactions to announcements in the United States. We find gains for target firms, while acquiring firms do not earn significant abnormal returns. These results occur amid high trading activity and an improvement in liquidity. Second, we extend the analysis to the competitors of the merging firms. Considering Canadian evidence, the release of information negatively affects their rivals. Third, we study the conditions under which managers are more willing to listen to investors. Analyzing a sample of French acquisitions, we find that well-connected managers are more likely to complete a deal in spite of a negative market reaction to the acquisition announcement.
APA, Harvard, Vancouver, ISO, and other styles
19

Barraud, Christophe. "L'Efficience informationnelle du marché des paris sportifs : un parallèle avec les marchés boursiers." Phd thesis, Université Paris Dauphine - Paris IX, 2012. http://tel.archives-ouvertes.fr/tel-00834768.

Full text
Abstract:
This thesis presents the sports betting market and, more precisely, shows how it offers a simplified observational setting sufficiently close to stock markets to test the theory of informational efficiency and to reach unanimous conclusions about its empirical validity. First, we focus on the weak form of informational efficiency, and more specifically on an anomaly known as the Favourite-Longshot Bias, which has been documented both in sports betting markets and in stock markets. Using a large sample of data, we show that transaction costs and bettors' preferences have a significant impact on the level of the odds offered by bookmakers and therefore on the structure of prices. We also discuss the rationality of bettors and show how their behaviour is not so different from that of investors in stock markets. Second, we analyse in detail the strong form of informational efficiency, and more precisely the relevance of the bid-ask spread as an indicator of insider trading in the sports betting setting.
APA, Harvard, Vancouver, ISO, and other styles
20

Tvrdíková, Alena. "Nové trendy v logistice uplatňované v mezinárodním obchodě." Master's thesis, Vysoká škola ekonomická v Praze, 2011. http://www.nusl.cz/ntk/nusl-112778.

Full text
Abstract:
The author highlights the important role of logistics and its application to international trade. To succeed in business it is crucial to identify new logistics trends in order to adapt to changes as soon as possible. The main objective of the thesis is therefore to analyse current logistics trends and to point out the factors that will have a major impact on the future direction of logistics. The thesis closes with case studies of Škoda Auto and Wal-Mart that illustrate new logistics approaches in practice.
APA, Harvard, Vancouver, ISO, and other styles
21

Verbeek, Erwin [Verfasser]. "Information Efficiency in Speculative Markets / Erwin Verbeek." Aachen : Shaker, 2011. http://d-nb.info/1084535610/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
22

McLafferty, Kevin. "Operational efficiency of industrialised information processing systems." Thesis, Cardiff Metropolitan University, 2016. http://hdl.handle.net/10369/8253.

Full text
Abstract:
Britain has always been a trading nation, first in goods and more recently in services. At the heart of this national and international trade is London, the hub of a global financial empire that unites the globe on a 24-hour basis. Vast revenues are generated by commercial and investment banking institutions, yet research in this sector has been comparatively sparse. Management researchers have instead gravitated towards the ‘back office’ operations of high street banks or general insurance company call centres (Seddon, 2008). Research has focused on repetitive clerical activities for customers, and on how these businesses suffer from ‘failure demand’ and/or ‘demand amplification’ (Forrester, 1961) created when a customer is forced to re-establish contact with a call centre to have their issue or concern reworked (when it should have been ‘right first time’). Modern commercial and investment banks do not share the repetitive and relatively predictable transactions of call centres and instead represent extreme operations management cases. The workload placed upon commercial and investment banking systems is extremely high volume, high value and high variety in terms of what clients demand and how ‘the product’ (trades) is executed. At the time of writing, the financial collapse of 2008 is still shaping working practices through a punitive regulatory environment. With many UK banks now part-owned by the government, there is social and political pressure to stimulate improvement in banking operations which, it is thought, will herald the return of the banks to private ownership. This thesis addresses the flow of global “trades” through the operations office and explores how the design and fit of the sociotechnical environment provides effective and efficient trade flow performance. The key research questions emerging from the literature review, which establishes the gap in knowledge, are: 1) How efficient are commercial and investment banking trade processing systems? 2) What are the enablers and inhibitors of efficient, high-performance industrialised processing systems? To answer these questions, the researcher undertook an in-depth, longitudinal case study at a British bank that had been benchmarked as underperforming against its peers (MGT Report, 2011). The case study strategy was executed using an action research and reflective learning approach (cycles of research) to explore the performance and improvement of banking operations. The findings show that, using systems feedback, the management at the bank were able to develop into a “learning organisation” (Senge, 1990) and improve and enhance the flow of work through the system. The study resulted in significant gains for the case organisation, and a new model of Rolled Throughput Yield is presented that rests on the key concept of “Information Fidelity”. This work contributes to the operations management body of knowledge by exploring “flow” under conditions of high volume and high variety, within the under-researched context of commercial and investment banking. Note: “MGT” is an anonymised commercial and investment banking industry report into operational efficiency and cost performance. The report was commissioned by the participant banks, conducted by “MGT Consultants”, and is considered highly confidential. The researcher was given a copy of the report while working with the case, and it formed the catalyst for the research into operational performance. The researcher was unable to obtain “MGT Consultants” agreement to directly cite the report as part of this study.
APA, Harvard, Vancouver, ISO, and other styles
23

Malek, Behzad. "Efficient private information retrieval." Thesis, University of Ottawa (Canada), 2005. http://hdl.handle.net/10393/26966.

Full text
Abstract:
In this thesis, we study Private Information Retrieval and Oblivious Transfer, two strong cryptographic tools that are widely used in various security-related applications, such as private data-mining schemes and secure function evaluation protocols. The first non-interactive, secure dot-product protocol, widely used in private data-mining schemes, is proposed based on trace functions over finite fields. We further improve the communication overhead of the best previously known Oblivious Transfer protocol from O((log n)²) to O(log n), where n is the size of the database. Our communication-efficient Oblivious Transfer protocol is a non-interactive, single-database scheme that is generally built on homomorphic encryption functions. We also introduce a new protocol that reduces the computational overhead of Private Information Retrieval protocols. This protocol is shown to be computationally secure for users, depending on the security of the McEliece public-key cryptosystem. The total online computational overhead is the same as in the case where no privacy is required. The computation-saving protocol can be implemented entirely in software, without any need to install a secure piece of hardware or to replicate the database among servers.
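The thesis concerns single-database, computationally secure PIR. As an illustration of the PIR primitive itself (not of the thesis's own protocols), the classic two-server information-theoretic scheme of Chor et al. can be written in a few lines; note its communication is linear in n, far above the O(log n) overhead discussed above:

```python
import secrets

def xor_pir_query(n, i):
    """Two-server PIR query for bit i of an n-bit database.
    Each server individually sees a uniformly random subset,
    revealing nothing about i; together the subsets differ only at i."""
    s1 = {j for j in range(n) if secrets.randbits(1)}
    s2 = s1 ^ {i}  # symmetric difference: flips membership of i only
    return s1, s2

def xor_pir_answer(db, subset):
    """Each server XORs the database bits indexed by its subset."""
    ans = 0
    for j in subset:
        ans ^= db[j]
    return ans

db = [1, 0, 1, 1, 0, 0, 1, 0]  # toy 8-bit database
i = 3
s1, s2 = xor_pir_query(len(db), i)
# Shared indices cancel under XOR, leaving exactly db[i]
bit = xor_pir_answer(db, s1) ^ xor_pir_answer(db, s2)
```

Privacy here relies on the two servers not colluding, which is precisely the assumption single-database computational PIR removes.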
APA, Harvard, Vancouver, ISO, and other styles
24

Bazargan-Lari, Sina. "Benefits of information technology in improving preconstruction efficiency." [Gainesville, Fla.] : University of Florida, 2009. http://purl.fcla.edu/fcla/etd/UFE0041262.

Full text
APA, Harvard, Vancouver, ISO, and other styles
25

Verney, Steven P. "Pupillary responses index : information processing efficiency across cultures /." Diss., Connect to a 24 p. preview or request complete full text in PDF format. Access restricted to UC campuses, 2000. http://wwwlib.umi.com/cr/ucsd/fullcit?p9992386.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Rydberg, Christoffer. "Time Efficiency of Information Retrieval with Geographic Filtering." Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2015. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-172918.

Full text
Abstract:
This study addresses the time efficiency of two major models within Information Retrieval (IR): the Extended Boolean Model (EBM) and the Vector Space Model (VSM). Both models use the same weighting scheme, based on term frequency-inverse document frequency (tf-idf). The VSM uses a cosine score to rank document-query similarity. The EBM uses P-norm scores, which rank documents not just by matching terms but also by taking into account the Boolean interconnections between the terms in the query. Additionally, this study investigates how documents with a single geographic affiliation can be retrieved based on features such as the location and geometry of the geographic surface, and how this geographic search is best integrated with the two IR models previously described. From previous research we conclude that an index based on Z-Space Filling Curves (Z-SFC) is the best approach for documents containing a single geographic affiliation. When documents are retrieved from the Z-SFC index, there is no guarantee that they are relevant to the search area; it is, however, guaranteed that only the retrieved documents can be relevant. Furthermore, the ranked output of the IR models gives a great advantage to the geographic search, namely that we can focus on documents with high relevance. We intersect the results from one of the IR models with the results from the Z-SFC index and sort the resulting list of documents by relevance. We can then iterate over the list, check for intersections between each document's geometry and the search geometry, and retrieve only documents whose geometries are relevant to the search. Since the user is only interested in the top results, we can stop as soon as a sufficient number of results has been obtained. The conclusion of this study is that the VSM is an easy-to-implement, time-efficient retrieval model. It is inferior to the EBM in the sense that it is a rather simple bag-of-words model, while the EBM allows term conjunctions and disjunctions to be specified. The geographic search has been shown to be time efficient and independent of which of the two IR models is used. The gap in efficiency between the VSM and the EBM, however, increases drastically as the query gets longer and more results are obtained. Depending on the requirements of the user, the collection size, the length of queries, etc., the benefits of the EBM might outweigh its performance cost. For search engines with a big document collection and many users, however, it is likely to be too slow.
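A toy version of the VSM ranking described above (tf-idf weights, cosine scoring) can be sketched as follows; the corpus and query are illustrative only, not data from the thesis:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build sparse tf-idf vectors (dicts) for a toy corpus."""
    N = len(docs)
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()
    for toks in tokenized:
        df.update(set(toks))                     # document frequency
    idf = {t: math.log(N / df[t]) for t in df}   # inverse document frequency
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)                       # raw term frequency
        vectors.append({t: tf[t] * idf[t] for t in tf})
    return vectors, idf

def cosine(u, v):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * v.get(t, 0.0) for t, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

docs = ["stock market index",
        "geographic information retrieval",
        "stock index futures market"]
vecs, idf = tfidf_vectors(docs)

query = Counter("stock market".split())
qvec = {t: query[t] * idf.get(t, 0.0) for t in query}
ranking = sorted(range(len(docs)),
                 key=lambda i: cosine(qvec, vecs[i]), reverse=True)
```

The EBM differs only in the scoring step, replacing the cosine with P-norm scores over the query's Boolean structure.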
APA, Harvard, Vancouver, ISO, and other styles
27

Giouvris, Evangelos Thomas. "Issues in asset pricing, liquidity, information efficiency, asymmetric information and trading systems." Thesis, Durham University, 2006. http://etheses.dur.ac.uk/2940/.

Full text
Abstract:
Market microstructure is a relatively new area in finance which emerged as a result of inconsistency between actual and expected prices due to a variety of frictions (mainly trading frictions and asymmetric information) and the realisation that the trading process, through which investors' demand is ultimately translated into orders and volumes, is of greater importance in price formation than originally thought. Despite increased research into liquidity, asset pricing, asymmetric information and trading systems, all subfields of market microstructure, a number of questions remain unanswered, such as the effect of different trading systems on systematic liquidity, informational efficiency or the components of the spread. This thesis aims to shed light on those questions by providing a detailed empirical investigation of the effect of trading systems on systematic liquidity, pricing, informational efficiency, volatility and bid-ask spread decomposition, mainly with respect to the UK market (FTSE100 and FTSE250) and to a lesser extent with respect to the Greek market. Those two markets are at different levels of development/sophistication and are negatively correlated. The aims of this thesis are outlined in chapter one, with chapter two providing a detailed review of the theoretical literature relevant to this study. Chapter three is the first empirical chapter and tests for the presence of a common underlying liquidity factor (systematic liquidity) and its effect on pricing for FTSE100 and FTSE250 stocks under different trading regimes. Results show the presence of commonality for FTSE100 and FTSE250 stocks, although commonality is weaker for FTSE250 stocks and its role in pricing is reduced. Chapter four investigates the same issues with respect to the Greek market, and we find that commonality appears to be stronger in some periods while it is reduced to zero in others.
Chapter five focuses on the effect that changes in the trading systems can have on informational efficiency and volatility primarily with respect to FTSE100 and FTSE250. Different methodologies and data are employed for this purpose and produce similar results. We find that order driven markets are more responsive to incoming information when compared to quote driven markets. Volatility has a greater impact on the spread when the market is quote driven. We also examined if automated trading increased informational efficiency with respect to the Greek market. The results obtained indicated that the effect of automation was positive. Finally the last chapter focused on the effect of different trading systems on the components of the spread and their determinants. Our main finding is that the asymmetric component of the spread is higher under a quote driven market. Also stock volatility appears to affect the asymmetric component to a greater extent when the market is quote driven. We believe that the main justification for those findings is affirmative quotation.
APA, Harvard, Vancouver, ISO, and other styles
28

Malhotra, Suvarcha. "Information capacity and power efficiency in operational transconductance amplifiers." College Park, Md. : University of Maryland, 2003. http://hdl.handle.net/1903/104.

Full text
Abstract:
Thesis (M.S.) -- University of Maryland, College Park, 2003.
Thesis research directed by: Dept. of Electrical and Computer Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
APA, Harvard, Vancouver, ISO, and other styles
29

Cheah, Eng-Tuck. "The implications of information processing efficiency on decision making." Thesis, University of Southampton, 2011. https://eprints.soton.ac.uk/184037/.

Full text
Abstract:
This thesis investigates the implications of information processing efficiency for decision making, with respect to the ability of decision makers to process information in a rational and timely manner. Three different settings are used to examine the different aspects of information efficiency in decision making. First, the attitudes and perceptions held by individual decision makers play an important role in the information processing stage of a decision. The first thrust of this thesis therefore investigates the impact of the demographic characteristics of decision makers (socially responsible investors, SRIs) on their attitudes and perceptions (in relation to their corporate social responsibility (CSR) views). The results show that demographic characteristics are useful predictors of the CSR views held by SRIs. This implies that companies can reduce their cost of capital by attracting the affluent members of the SRI community and can increase their CSR rankings by creating diversity in their corporate boardrooms. These efforts, if undertaken by companies, can help increase their share price. Government agencies can also encourage companies to implement CSR agendas that will appeal to specific members of the SRI community (a clientele effect). Second, the ability of decision makers to process information in a rational manner can be seriously undermined when they are expected to match the different motivations underlying their own or others' objectives with the multiple choices available to them.
In the second thrust of the thesis, a state-contingent market (the UK horseracing pari-mutuel betting market) with multi-competitor choices is used to illustrate the discovery of the determinants of demand (day-of-the-week, weekend, public holiday, number of races in the same hour, field size, televised races, flat and jump races, race quality, timing of the race during the day, insider trading, track conditions, bookmakers' over-round and risk attitude of bettors) unique to different groups of decision makers (bettors). The results demonstrate that unique sets of determinants can be used to identify the different types of decision makers (that is, sophisticated and unsophisticated bettors). Clearly, the discovery of these unique determinants of demand can be used by the respective authorities (the British Horseracing Board, Horserace Betting Levy and Tote boards) in deciding which variables are important for influencing the behaviour of the respective decision makers (bettors and horseracing authorities). Third, decision makers ought to be able to arrive at a decision in a timely manner. The third thrust of this thesis investigates the speed of adjustment to the arrival of new and unexpected information in order to understand the financial integration process in the Asia Pacific region (APR). Using stock market capitalization as a measure of equity market size, it was also found that more advanced equity markets are more informationally efficient than less advanced equity markets, possibly because the infrastructure that supports information flow makes information easily accessible to investors for decision making. The results suggest that a more integrated equity market in the APR can lead to a greater speed of adjustment to information shocks. 
Therefore, domestic governments have a role to play in ensuring that the necessary infrastructure to facilitate information flow is improved and better integrated with neighbouring equity markets. Finally, the thesis concludes that demographic characteristics play an important role in influencing the rational information processing involved in decision making by individuals. When confronted with choices, decision makers are affected by their various motivations, and those who seek to capitalise on others' decisions need to be aware of these motivations. In addition, the infrastructure on which information flows is essential in influencing the speed at which information is processed.
APA, Harvard, Vancouver, ISO, and other styles
30

Wang, Yong. "Diversification, information asymmetry, cost of capital, and production efficiency." Diss., Temple University Libraries, 2008. http://cdm16002.contentdm.oclc.org/cdm/ref/collection/p245801coll10/id/13948.

Full text
Abstract:
Business Administration
Ph.D.
This study examines how diversification changes firms' key characteristics and consequently alters firm value. I focus on this topic because of the mixed findings in the literature about the valuation effect of diversification. This study offers deeper insights into the influence of diversification on important valuation factors already identified in the finance literature. Specifically, it examines whether diversification affects firms' information asymmetry problem, firms' cost of capital and cash flow, and firms' production efficiency. The study looks at both the financial and non-financial industries, and the chapters are arranged in the following order. Firstly, empirical studies show that investors do not value BHCs' pursuit of non-interest income generating activities, and yet these activities have grown at a dramatic pace in recent decades. An interesting question is what factors drive the discontent of investors with the diversification endeavors of BHCs in non-interest income activities. The first chapter examines the subject from the viewpoint of information opaqueness, which is unique in the banking industry in terms of its intensity. We propose that increased diversification into non-interest income activities deepens information asymmetry, making BHCs more opaque and, as a result, curtailing their value. Two important results are obtained in support of this proposition. First, analysts' forecasts are less accurate and more dispersed for BHCs with greater diversity of non-interest income activities, indicating that the information asymmetry problem is more severe for these BHCs. Second, stock market reactions to earnings announcements by these BHCs, which signal new information to the market, are larger, indicating that more information is revealed to the market by each announcement. 
These findings indicate that increased diversity of non-interest income activities is associated with more severe information asymmetry between insiders and outsiders and, hence, a lower valuation by shareholders. Secondly, since Lang and Stulz (1994) and Berger and Ofek (1995), the corporate literature has taken the position that industrial diversification is associated with a firm value discount. However, the validity and the sources of the diversification discount are still highly debated. In particular, extant studies limit themselves to cash flow effects, totally overlooking the cost of capital as a factor determining firm value. Inspired by Lamont and Polk (2001), the second chapter examines how industrial and international diversification change conglomerates' cost of capital (equity and debt), and thereby firm value. Our empirical results, based on a sample of Russell 3000 firms over the 1998-2004 period, show that industrial (international) diversification is associated with a lower (higher) cost of capital. These findings also hold for firms fully financed with equity. In addition, international diversification is found to be associated with a lower operating cash flow, while industrial diversification does not alter it. These results indicate that industrial (international) diversification is associated with firm value enhancement (destruction). Given that the majority of firms involved in industrial diversification also diversify internationally, failing to separate these two dimensions of diversification may result in mistakenly attributing the diversification discount to industrial diversification. Thirdly, financial conglomerates have been increasingly diversifying their business into banking, securities, and insurance activities, especially after the Gramm-Leach-Bliley Act (GLBA, 1999). The third chapter examines whether bank holding company (BHC) diversification is associated with improvement in production efficiency. 
Applying data envelopment analysis (DEA), the Malmquist index of productivity and total factor productivity change (a decomposed factor of the index) are calculated for a sample of BHCs over the period 1997-2007. The following results are obtained. First, technical efficiency is negatively associated with activity diversification, and the effect is primarily driven by BHCs that did not diversify through Section 20 subsidiaries before GLBA. Second, the degree of change in diversification over time does not affect the total factor productivity change but is negatively associated with technical efficiency change over time. This latter effect is also primarily shown for BHCs that did not have Section 20 subsidiaries before GLBA. Therefore, it can be concluded that diversification is on average associated with lower production efficiency of BHCs, especially those BHCs without the first-mover advantage obtained through Section 20 subsidiaries. These chapters explore the possible channels through which diversification could alter firm valuation. They contribute to the literature by offering further knowledge about the effect of diversification.
Temple University--Theses
APA, Harvard, Vancouver, ISO, and other styles
31

SZTUTMAN, ANDRE MEDEIROS. "INFORMATIONALLY EFFICIENT MARKETS UNDER RATIONAL INATTENTION." PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO, 2017. http://www.maxwell.vrac.puc-rio.br/Busca_etds.php?strSecao=resultado&nrSeq=31787@1.

Full text
Abstract:
PONTIFÍCIA UNIVERSIDADE CATÓLICA DO RIO DE JANEIRO
CONSELHO NACIONAL DE DESENVOLVIMENTO CIENTÍFICO E TECNOLÓGICO
FUNDAÇÃO DE APOIO À PESQUISA DO ESTADO DO RIO DE JANEIRO
BOLSA NOTA 10
We propose a new solution for the Grossman and Stiglitz [1980] paradox. By substituting a rational inattention restriction for their information structure, we show that prices can reflect all the information available without breaking the incentives of market participants to gather information. This model reframes the efficient market hypothesis and reconciles opposing views: prices are fully revealing but only for those who are sufficiently smart. Finally, we develop a method for postulating and solving Walrasian general equilibrium models with rationally inattentive agents circumventing previous tractability assumptions.
APA, Harvard, Vancouver, ISO, and other styles
32

Daoud, Amjad M. "Efficient data structures for information retrieval." Diss., Virginia Tech, 1993. http://hdl.handle.net/10919/40031.

Full text
APA, Harvard, Vancouver, ISO, and other styles
33

Chli, Margarita. "Applying information theory to efficient SLAM." Thesis, Imperial College London, 2010. http://hdl.handle.net/10044/1/5634.

Full text
Abstract:
The problem of autonomous navigation of a mobile device is at the heart of the more general issue of spatial awareness and is now a well-studied problem in the robotics community. Following a plethora of approaches throughout the history of this research, implementations have recently been converging towards vision-based methods. While the primary reason for this success is the enormous amount of information content encoded in images, this is also the main obstacle in achieving faster and better solutions. The growing demand for high-performance systems able to run on affordable hardware pushes algorithms to the limits, imposing the need for more effective approximations within the estimation process. The biggest challenge lies in achieving a balance between two competing goals: the optimisation of time complexity and the preservation of the desired precision levels. The key is in agile manipulation of data, which is the main idea explored in this thesis. Exploiting the power of probabilistic priors in sequential tracking, we conduct a theoretical investigation of the information encoded in measurements and estimates, which provides a deep understanding of the map structure as perceived through the camera lens. Employing information theoretic principles to guide the decisions made throughout the estimation process, we demonstrate how this methodology can boost both the efficiency and consistency of algorithms. Focusing on the most challenging processes in a state-of-the-art system, we apply our information theoretic framework to local motion estimation and maintenance of large probabilistic maps. Our investigation gives rise to dynamic algorithms for quality map-partitioning and robust feature mapping in the presence of significant ambiguity and variable camera dynamics. The latter is further explored to achieve scalable performance allowing dense feature matching based on concrete probabilistic decisions.
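As a rough illustration of the kind of information-theoretic guidance described in this abstract, the sketch below scores candidate measurements in a Gaussian (Kalman-filter-style) estimator by their mutual information with the state. The matrices and numbers are invented for illustration and are not taken from the thesis.

```python
import numpy as np

def information_gain_bits(P, H, R):
    """Mutual information (in bits) between the state and a candidate
    measurement z = H x + noise, given Gaussian prior covariance P."""
    S = H @ P @ H.T + R                      # innovation covariance
    _, logdet_S = np.linalg.slogdet(S)
    _, logdet_R = np.linalg.slogdet(R)
    return 0.5 * (logdet_S - logdet_R) / np.log(2.0)

# Two candidate feature measurements: prefer the one that tells us more.
P = np.diag([4.0, 1.0])        # prior: feature 0 is far more uncertain
R = np.array([[0.25]])         # measurement noise variance
h0 = np.array([[1.0, 0.0]])    # observe feature 0
h1 = np.array([[0.0, 1.0]])    # observe feature 1

gains = [information_gain_bits(P, h, R) for h in (h0, h1)]
print([round(g, 3) for g in gains])   # measuring feature 0 is more informative
```

An information-guided tracker would measure feature 0 first, since its expected uncertainty reduction is larger for the same measurement cost.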
APA, Harvard, Vancouver, ISO, and other styles
34

Brennan, Alan. "Efficient Computation of Value of Information." Thesis, University of Sheffield, 2007. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.487609.

Full text
Abstract:
This thesis is concerned with computation of expected value of information (EVI). The topic is important because EVI methodology is a rational, coherent framework for prioritising and planning the design of biomedical and clinical research studies that represent an enormous expenditure world-wide. At the start of my research few studies existed. During the course of the PhD, my own work and that of other colleagues has been published and the uptake of the developing methods is increasing. The thesis contains a review of the early literature as well as of the emerging studies over the 5 years since my first work was done in 2002 (Chapter 2). Methods to compute partial expected value of perfect information are developed and tested in illustrative cost-utility decision models with non-linear net benefit functions and correlated parameters. Evaluation using nested Monte Carlo simulations is investigated and the number of inner and outer simulations required is explored (Chapter 3). The computation of expected value of sample information using nested Monte Carlo simulations, combined with Bayesian updating of model parameters with conjugate distributions given simulated data, is examined (Chapter 4). In Chapter 5, a novel Bayesian approximation for posterior expectations is developed, and this is applied and tested in the computation of EVSI for an illustrative model, again with normally distributed parameters. The application is further extended to a non-conjugate proportional hazards Weibull distribution, a common circumstance for clinical trials concerned with survival or time-to-event data (Chapter 6). The application of the Bayesian approximation in the Weibull model is then tested against four other methods for estimating the Bayesian updated Weibull parameters, including the computationally intensive Markov Chain Monte Carlo (MCMC) approach, which could be considered the gold standard (Chapter 7). 
The result of the methodological developments in this thesis and the testing on case studies is that some new approaches to computing EVI are now available. In many models this will improve the efficiency of computation, making EVI calculations possible in some previously infeasible circumstances. In Chapter 8, I summarise the achievements made in this work, how they relate to the work of other scholars, and the research agenda which still faces us. I conclude with the firm hope that EVI methods will begin to provide decision makers with clearer support when deciding on investments in further research.
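The EVPI idea at the core of these computations can be illustrated with a minimal Monte Carlo sketch: a toy two-option decision with normally distributed net benefits, not one of the thesis's case-study models.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Toy decision: two options whose net benefits depend on uncertain parameters.
nb = np.column_stack([
    rng.normal(1000.0, 300.0, n),   # net benefit of option A
    rng.normal(1100.0, 500.0, n),   # net benefit of option B
])

ev_current = nb.mean(axis=0).max()  # choose once, on expected net benefit
ev_perfect = nb.max(axis=1).mean()  # choose per simulation, with perfect info
evpi = ev_perfect - ev_current      # expected value of perfect information

print(round(evpi, 1))
```

Partial EVPI and EVSI replace the inner `max` with a conditional expectation over the remaining uncertainty, which is where the nested simulations and Bayesian approximations discussed above come in.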
APA, Harvard, Vancouver, ISO, and other styles
35

Ableiter, Dirk. "Smart caching for efficient information sharing in distributed information systems." Thesis, Monterey, Calif. : Naval Postgraduate School, 2008. http://edocs.nps.edu/npspubs/scholarly/theses/2008/Sept/08Sep%5FAbleiter.pdf.

Full text
Abstract:
Thesis (M.S. in Computer Science)--Naval Postgraduate School, September 2008.
Thesis Advisor(s): Singh, Gurminder; Otani, Thomas. "September 2008." Description based on title screen as viewed on October 31, 2008. Includes bibliographical references (p. 45-50). Also available in print.
APA, Harvard, Vancouver, ISO, and other styles
36

Sparti, Steve. "Payback information : it's effect on home buyers regarding energy efficiency /." Diss., CLICK HERE for online access, 2006. http://contentdm.lib.byu.edu/ETD/image/etd1343.pdf.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Novak, Jiri. "On the Importance of Accounting Information for Stock Market Efficiency." Doctoral thesis, Uppsala : Företagsekonomiska institutionen, Uppsala universitet, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-8411.

Full text
APA, Harvard, Vancouver, ISO, and other styles
38

Duxbury, Darren. "Market efficiency and the role of information : an experimental analysis." Thesis, University of Leeds, 1998. http://etheses.whiterose.ac.uk/2777/.

Full text
Abstract:
The purpose of this research is to gain additional insight concerning the highly efficient market outcomes generated under the rules of trade of the double auction, in which traders have the dual role of both buyer and seller and can simultaneously call out offers to buy and sell. It is conjectured that the experimental literature's robust results detailing the efficiency of the double auction institution may be a product of the constant and known duration of trade incorporated in previous experimental designs. In one of the few relevant theoretical discussions, Friedman (1984, p.71) suggests that the predetermined, known time at which trade will cease is one of a number of institutional features of experimental double auction markets that enhance the efficiency of observed market outcomes. Known trading duration may well be a key variable in the determination of the price formation process and the convergence to competitive equilibrium in the double auction institution. This study extends previous work by conducting a series of experiments designed to determine the importance of trading duration on the convergence tendencies of experimental asset markets governed by the rules of the double auction institution. The issue is of substantive theoretical and practical interest. The results of this study offer a number of conclusions. Aggregated across the eighteen experimental asset markets studied, transaction prices tend to exhibit convergence to competitive outcomes. Importantly, the effect of known period duration on observed market behaviour is significant. Experimental asset markets that incorporate uncertain trading durations display more aggressive trading strategies. This is evidenced by an increase in the rate of trade relative to markets where the duration of trade varies but is known. The markets with uncertain trading durations also exhibit reduced levels of market efficiency relative to the other markets studied. 
The implication is clear: any future refinement of either theoretical models or institutions of exchange must explicitly recognise the effect of uncertain trading duration on market behaviour in double auctions.
APA, Harvard, Vancouver, ISO, and other styles
39

Sparti, Steven E. "Payback Information: It's Effect on Home Buyers Regarding Energy Efficiency." BYU ScholarsArchive, 2006. https://scholarsarchive.byu.edu/etd/470.

Full text
Abstract:
This study was conducted to find out how payback analysis would affect consumer decision making with regard to home energy-efficient upgrade packages. Three different home plans were obtained from a local builder and seven different energy-efficient packages were created. Using Hot2000, the heating and cooling loads were calculated for each building, with each energy-efficient package, in each of the four major cardinal directions. The averages were taken and the payback information was calculated. The payback information included the increased cost of the package, the increase in the mortgage payment, the annual savings on heating and cooling bills, the monthly savings, the positive or negative monthly cash flow, the amount of time and interest saved if the monthly savings were added to the mortgage principal, the number of years required to pay back the original investment, the rate of return, and the increased home value. A survey was taken to see how subjects would react to viewing the payback information. The subjects were individuals looking to buy a home in the next 12 months somewhere along the Wasatch Front area in Utah. Depending on the size of the home the subjects were looking for, they were shown the different packages with their accompanying cost increase and how that would affect their monthly mortgage payment. The subjects then chose the package they would want for their home, based on their knowledge of construction materials, the additional cost, and how it would affect their mortgage. They were then shown the payback information for the home that was chosen and asked if they would change their mind concerning the previous decision. They were then asked which parts of the payback information they found to be most useful. 
This study shows that payback information is indeed useful and would help builders to attract new customers, increase profits, and provide customers with powerful information that will empower them to make better decisions about home energy efficiency.
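The core payback arithmetic described above might look like the following sketch. The interest rate, term, and dollar figures are invented for illustration, and the function name is hypothetical, not taken from the study.

```python
def payback_summary(extra_cost, annual_savings, rate=0.06, years=30):
    """Illustrative payback figures for an energy-efficiency upgrade
    financed within a fixed-rate mortgage (assumed 6%, 30 years)."""
    r = rate / 12                                         # monthly interest rate
    n = years * 12                                        # number of payments
    extra_payment = extra_cost * r / (1 - (1 + r) ** -n)  # added monthly payment
    monthly_savings = annual_savings / 12
    return {
        "extra_monthly_payment": round(extra_payment, 2),
        "monthly_cash_flow": round(monthly_savings - extra_payment, 2),
        "simple_payback_years": round(extra_cost / annual_savings, 1),
    }

# A $4,000 upgrade package saving $600 a year in heating and cooling bills:
print(payback_summary(4000, 600))
```

Under these assumed numbers the energy savings exceed the added mortgage payment, which is the kind of positive-cash-flow framing the study presented to home buyers.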
APA, Harvard, Vancouver, ISO, and other styles
40

Pereira, Caio Augusto Vigo. "Portfolio efficiency tests with conditioning information using empirical likelihood estimation." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/96/96131/tde-11072016-145134/.

Full text
Abstract:
We evaluate the use of Generalized Empirical Likelihood (GEL) estimators in portfolio efficiency tests for asset pricing models in the presence of conditioning information. Estimators from the GEL family present some optimal statistical properties, such as robustness to misspecification and better properties in finite samples. Unlike GMM, the bias of GEL estimators does not increase as more moment conditions are included, a situation that is expected in conditional efficiency analysis. We find some evidence that estimators from the GEL class really do perform differently in small samples, where efficiency tests using GEL generate lower estimates compared to tests using the standard GMM approach. Monte Carlo experiments show that GEL performs better when distortions are present in the data, especially under heavy tails and Gaussian shocks.
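The empirical-likelihood machinery behind the GEL family can be sketched in miniature: below, the profile EL ratio for a scalar mean is computed from the moment condition E[x − μ] = 0 on simulated heavy-tailed data. This is a toy setup, not the thesis's portfolio moment conditions.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_t(df=5, size=400) + 0.3   # heavy-tailed sample, true mean 0.3

def el_log_ratio(mu):
    """Profile empirical log-likelihood ratio statistic for E[x - mu] = 0."""
    g = x - mu
    # Solve sum g_i / (1 + lam*g_i) = 0 for the Lagrange multiplier lam,
    # keeping 1 + lam*g_i > 0 so the implied probabilities stay positive.
    lo = (-1.0 + 1e-9) / g.max()
    hi = (-1.0 + 1e-9) / g.min()
    for _ in range(200):                    # bisection: the sum is decreasing
        mid = 0.5 * (lo + hi)
        if np.sum(g / (1.0 + mid * g)) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = 1.0 / (len(x) * (1.0 + lam * g))    # implied EL probabilities
    return -2.0 * np.sum(np.log(len(x) * w))

# The statistic is ~0 at the sample mean and grows away from it.
print(round(el_log_ratio(x.mean()), 4), round(el_log_ratio(x.mean() + 0.5), 1))
```

GMM instead minimises a quadratic form in the sample moments; the EL reweighting above is what gives GEL-type estimators their different finite-sample behaviour.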
APA, Harvard, Vancouver, ISO, and other styles
41

Costello, Greg. "Price discovery and information diffusion in the Perth housing market 1988-2000." UWA Business School, 2004. http://theses.library.uwa.edu.au/adt-WU2005.0034.

Full text
Abstract:
[Truncated abstract] This thesis examines informational efficiency and price discovery processes within the Perth housing market for the period 1988-2000 by utilising a rich source of Western Australian Valuer General’s Office (VGO) data. Fama’s (1970) classification of market efficiency as potentially weak form, semi-strong, or strong form has been a dominant paradigm in tests of market efficiency in many asset markets. While there are some parallels, the results of tests in this thesis suggest there are also limitations in applying this paradigm to housing markets. The institutional structure of housing markets dictates that a deeper recognition of important housing market characteristics is required. Efficiency in housing markets is desirable in that if prices provide accurate signals for purchase or disposition of real estate assets this will facilitate the correct allocation of scarce financial resources for housing services. The theory of efficient markets suggests that it is desirable for information diffusion processes in a large aggregate housing market to facilitate price corrections. In an efficient housing market, these processes can be observed and will enable housing units to be exchanged with an absence of market failure in all price and location segments. Throughout this thesis there is an emphasis on disaggregation of the Perth housing market both by price and location criteria. Results indicate that the Perth housing market is characterised by varying levels of informational inefficiency in both price and location segments and there are some important pricing-size influences.
APA, Harvard, Vancouver, ISO, and other styles
42

Zhang, Xiang. "Efficiency in Emergency medical service system : An analysis on information flow." Thesis, Växjö University, School of Mathematics and Systems Engineering, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-1620.

Full text
Abstract:

In an information system comprising many information services, we are always seeking ways to enhance efficiency and reusability. An emergency medical service system is a classic information system using application integration, in which reliable transmission of information flows is essential. We should always ensure that such a system runs in its best condition, with the highest efficiency and reusability, since the system's efficiency directly affects human life.

The aim of this thesis is to analyse the emergency medical service system in both qualitative and quantitative ways. A further aim is to suggest a method for assessing the information flow, through an analysis of system efficiency and of the correlations between information flow traffic and system applications.

The result is that the system is a main platform integrating five information services. Each service provides a different, independent function, while all are based on unified information resources. The system efficiency can be judged by a method called performance evaluation; the correlations can be judged by the multi-factorial analysis of variance method.

APA, Harvard, Vancouver, ISO, and other styles
43

Kitchin, Patricia Lee III. "A New Method for Comparing Experiments and Measuring Information." Diss., Virginia Tech, 1997. http://hdl.handle.net/10919/30758.

Full text
Abstract:
A statistic that summarizes an entire data set without losing any information about the family of distributions or the model is often called a sufficient statistic. Generally, one would like to use the statistic that contains the most information about the parameter space. Sometimes there are several sufficient statistics. At other times the only sufficient statistic is the entire data set. A large data set can be difficult to work with. In this case, can one use a statistic that, though not sufficient, does summarize the data set somewhat? How much information would be lost? How can one compare two statistics that aren't sufficient in terms of the amount of information each provides? A new method for comparing experiments and measuring information is introduced. No assumptions are made and no conditions are required in order for this new method to measure the amount of information contained in almost any statistic. Several properties of this new method are discussed and a new characterization of sufficiency based on this new method is presented. The new method is used to evaluate the expected efficiency of a statistic in discriminating between any two values of the parameter as compared to a sufficient statistic. This new method can be self-calibrated to give this expected efficiency a meaningful scale. It is shown that this new method has some advantages over existing methods of measuring information. This new method is applied to Casino Blackjack. Several card-counting statistics are compared by the amount of information each provides in discriminating between different deck compositions as compared to a sufficient statistic. This new method provides new insight about information in card-counting statistics by putting this information on a meaningful scale.
Ph. D.
APA, Harvard, Vancouver, ISO, and other styles
44

Bartelmess, Johan. "Compression efficiency of different picture coding structures in High Efficiency Video Coding (HEVC)." Thesis, Uppsala universitet, Avdelningen för systemteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-281926.

Full text
Abstract:
Video content is expected to account for 80 percent of all Internet traffic in 2019. There is therefore an increasing need for better video compression methods to decrease the use of Internet bandwidth. One way of achieving high video compression is to predict the pixel values of a video frame from prior and succeeding pictures in the video. The H.265 video compression standard supports this method and, in particular, makes it possible to specify the order in which pictures are coded and which pictures are predicted from which. The coding order is specified for Groups of Pictures (GOPs), where a number of pictures are grouped together and predicted from each other in a specified order. This thesis evaluates how GOPs should be structured, for instance in terms of size, to maximize compression efficiency relative to video quality. It also investigates the effect of multiple reference pictures, a feature that allows the picture yielding the best prediction to be selected. The results show that the largest tested GOP size of 32 pictures is preferable for all tested video characteristics, and that support for multiple reference pictures yields a similar increase in compression efficiency for all GOP sizes.
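The dyadic hierarchical-B structure such GOPs typically use can be sketched as follows. This is a generic illustration of coding order and temporal layers, not the exact reference-encoder configuration tested in the thesis.

```python
def coding_order(gop_size):
    """Return (display_index, temporal_layer) pairs in coding order for a
    dyadic hierarchical GOP; picture 0 of the GOP is the previous key frame."""
    order = [(gop_size, 0)]                 # the GOP-closing picture comes first
    step, layer = gop_size // 2, 1
    while step >= 1:
        for poc in range(step, gop_size, 2 * step):
            order.append((poc, layer))      # bi-predicted from coded neighbours
        step //= 2
        layer += 1
    return order

# For a GOP of 8: picture 8 first, then 4, then 2 and 6, then the odd pictures.
print(coding_order(8))
```

Larger GOP sizes simply add deeper layers to this hierarchy, letting most pictures be predicted from close, already-coded neighbours, which is one intuition for why the 32-picture GOP compresses best.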
APA, Harvard, Vancouver, ISO, and other styles
45

Ramirez, de Arellano Serna Antonio. "Incorporating preference information in Data Envelopment Analysis via external restrictions." Thesis, University of York, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.367467.

Full text
APA, Harvard, Vancouver, ISO, and other styles
46

Gustafsson, Anders, and Peter Wulff. "EFFICIENT HANDOVER OF INFORMATION IN SUPPLY CHAINS." Thesis, Växjö University, School of Mathematics and Systems Engineering, 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:vxu:diva-2334.

Full text
Abstract:

Abstract: Information logistics is about moving information between different individuals or systems to the place in time and space where a demand for the information arises. This is achieved through different kinds of information bearers, distributed through different channels in an optimal way. How can this be done without distorting the information in the interfaces between different participants? That is exactly what this report is about.

The report has resulted in the Information Quality Model (IQM) and the Information Quality Process (IQP).

IQM is a model for the evaluation of information that explains which characteristics affect the efficiency of information. IQM is built on Aspects, Characteristics and Questions. If, for example, we have the right information and can deliver it to the right place at the right time, but with a poor layout, the efficiency can still be low. We can therefore state that all the stated aspects, with their associated characteristics, must be attended to in order to achieve good quality in the interfaces.

IQP is a process showing how to handle and build up processes for collecting and processing information with a method that secures quality. The method is built on an analysis phase, in which frames are first created for what should be collected and how it should be done, and an execution phase, in which information is filled in according to the rules set up in the analysis phase.

Together, IQM and IQP form a solution that illuminates the whole process of information delivery in a unique way.

The research method is mainly case based and builds on the authors' empirical experiences.



APA, Harvard, Vancouver, ISO, and other styles
47

Bako, Boto [Verfasser]. "Efficient information dissemination in VANETs / Boto Bako." Ulm : Universität Ulm, 2016. http://d-nb.info/1122195583/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Shin, Jungmin. "Data transform composition for efficient information integration." [Gainesville, Fla.] : University of Florida, 2009. http://purl.fcla.edu/fcla/etd/UFE0024907.

Full text
APA, Harvard, Vancouver, ISO, and other styles
49

PONNA, CHAITANYA, and BODEPUDI RAKESH CHOWDARY. "HOW ENABLE EFFICIENT INFORMATION INTERCHANGES IN VIRTUAL." Thesis, Högskolan i Borås, Institutionen Handels- och IT-högskolan, 2011. http://urn.kb.se/resolve?urn=urn:nbn:se:hb:diva-20865.

Full text
Abstract:
The Internet is a collection of computer networks and the most important networking environment in the world; it is used for information interchange in virtual networks. Wireless technologies such as Wi-Fi and WiMAX can offer a more convenient and easier way to interchange information in a virtual network. The best-known ongoing wireless city projects include Wireless Philadelphia, Google Wi-Fi Mountain View, Wireless Taipei City, and the San Francisco Tech Connect project. The Web has limits in interactivity and presentation: its client-server architecture hinders information exchange, and most Web applications are designed for conventional computers rather than mobile handheld devices. To improve information exchange on the Web, Web 2.0 has been proposed. Web 2.0 refers to a perceived or planned second generation of Internet-based services, such as social networking sites, wikis, and communication tools, which emphasise online teamwork and sharing among users. A virtual network or online community is a collection of people who may or may not chiefly or originally connect or interact via the Internet. Virtual networks have also become an additional form of communication among people who know each other in real life. Today, the term virtual network is used loosely for a diversity of social groups interacting via the Internet; it does not necessarily mean that there is a strong bond between the members. The thesis discusses validation criteria such as internal validity, external validity and reliability, and how they apply to this research; interviews have also been used as a research method. Diagrams, models, prototypes and text are used to present and explain the results, with references to the original data collected from various sources such as the Internet, websites, books and journals.
Program: Magisterutbildning i informatik (Master's programme in informatics)
APA, Harvard, Vancouver, ISO, and other styles
50

Caria, Antonio Stefano. "Efficiency and other-regarding preferences in information and job-referral networks." Thesis, University of Oxford, 2015. http://ora.ox.ac.uk/objects/uuid:4c243348-af82-4cdc-b402-e75997e4a599.

Full text
Abstract:
In this thesis I study how networks are formed and I analyse the strategies that well-connected individuals adopt in public good games on a network. In chapter one I study an artefactual field experiment in rural India which tests whether farmers can create efficient networks in a repeated link formation game, and whether group categorisation increases the frequency of in-group links and reduces network efficiency. I find that the efficiency of the networks formed in the experiment is significantly lower than the efficiency which could be achieved under selfish, rational play. When information about group membership is disclosed, in-group links are chosen more frequently, while the efficiency of network structure is not significantly affected. Using a job-referral network experiment in an urban area of Ethiopia, I investigate in chapter two whether individuals create new links with the least connected players in the network. In a first treatment, competition for job-referrals makes it in the player's interest to link with the least connected partners. In this treatment, links to the least connected players are significantly more likely than links to better connected individuals. In a second treatment, connections only affect the welfare of the new partner. Choosing the least connected player minimises inequality and maximises aggregate efficiency. This may motivate other-regarding players. In this treatment, however, links to least connected partners are not significantly more likely than links to other players. In chapter three I explore the characteristics that individuals value in the people they approach for advice. Using cross-sectional data on cocoa farmers in Ghanaian villages and a matched lottery experiment, I find an association between the difference in the aversion to risk of two farmers and the probability that one farmer is interested in the advice of the other farmer. 
In chapter four I study a one-shot public good game in rural India between farmers connected by a star network. Contributions by the centre of the star have a larger impact on aggregate payoffs than contributions by the spoke players. I use the strategy method to study whether the centre of the star contributes more than the average of the spokes. In selected sessions, I disclose participants' expectations about the choices of the centre of star. I find that the centre player contributes just as much as the average of the spokes, and that he is influenced by the expectations that other players hold about his decisions.
APA, Harvard, Vancouver, ISO, and other styles