Dissertations / Theses on the topic 'Bayesian econometrics'

Consult the top 50 dissertations / theses for your research on the topic 'Bayesian econometrics.'

1

Kim, Dong-Hyuk. "Bayesian Econometrics for Auction Models." Diss., The University of Arizona, 2010. http://hdl.handle.net/10150/193663.

Abstract:
This dissertation develops Bayesian methods to analyze data from auctions and produce policy recommendations for auction design. The first essay, "Auction Design Using Bayesian Methods," proposes a decision theoretic method to choose a reserve price in an auction using data from past auctions. Our method formally incorporates parameter uncertainty and the payoff structure into the decision procedure. When the sample size is modest, it produces higher expected revenue than the plug-in methods; Monte Carlo evidence for this is provided. The second essay, "Flexible Bayesian Analysis of First Price Auctions Using Simulated Likelihood," develops an empirical framework that fully exploits all the shape restrictions arising from economic theory: bidding monotonicity and density affiliation. We directly model the valuation density so that bidding monotonicity is automatically satisfied, and restrict the parameter space to rule out all the nonaffiliated densities. Our method uses a simulated likelihood to allow for a very flexible specification, but the posterior analysis is exact for the chosen likelihood. Our method controls the smoothness and tail behavior of the valuation density and provides a decision theoretic framework for auction design. We reanalyze a dataset of auctions for drilling rights in the Outer Continental Shelf that has been widely used in past studies. Our approach gives significantly different policy prescriptions on the choice of reserve price than previous methods, suggesting the importance of the theoretical shape restrictions. Lastly, in the essay "Simple Approximation Methods for Bayesian Auction Design," we propose simple approximation methods for Bayesian decision making in auction design problems. Asymptotic posterior distributions replace the true posteriors in the Bayesian decision framework; these are typically a Gaussian model (second price auction) or a shifted exponential model (first price auction). Our method first approximates the posterior payoff using the limiting models and then maximizes the approximate posterior payoff. Both the approximate and exact Bayes rules converge to the true revenue maximizing reserve price under certain conditions. Monte Carlo studies show that our method closely approximates the exact procedure even for fairly small samples.
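To make the decision-theoretic idea concrete, here is a minimal sketch of choosing a reserve price by maximizing posterior-averaged expected revenue, contrasted with a plug-in rule that fixes the parameter at its posterior mean. Everything here is hypothetical: exponential valuations, a second-price auction, and simulated stand-ins for posterior draws.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_revenue(reserve, theta, n=3, sims=1000):
    """Monte Carlo expected seller revenue in a second-price auction
    with n bidders whose valuations are Exp(theta)."""
    v = rng.exponential(1.0 / theta, size=(sims, n))
    v.sort(axis=1)
    first, second = v[:, -1], v[:, -2]
    rev = np.where(first >= reserve, np.maximum(second, reserve), 0.0)
    return rev.mean()

# Stand-in for MCMC draws of theta estimated from past auction data.
theta_draws = rng.gamma(shape=20.0, scale=1.0 / 20.0, size=100)

grid = np.linspace(0.0, 2.0, 21)
# Bayes rule: maximize the posterior-averaged expected revenue.
bayes = [np.mean([expected_revenue(r, t) for t in theta_draws]) for r in grid]
# Plug-in rule: maximize revenue at the posterior mean, ignoring uncertainty.
t_hat = theta_draws.mean()
plugin = [expected_revenue(r, t_hat) for r in grid]

print("Bayes reserve:", grid[int(np.argmax(bayes))])
print("Plug-in reserve:", grid[int(np.argmax(plugin))])
```

With diffuse posteriors the two rules can disagree noticeably, which is precisely the small-sample effect the first essay quantifies.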
2

Kalli, Maria. "Bayesian Nonparametrics and Applications in Financial Econometrics." Thesis, University of Kent, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.499786.

3

Cornwall, Gary J. "Three Essays on Bayesian Econometric Methods." University of Cincinnati / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=ucin1504801632767553.

4

Santos, Fernando Genta dos. "Ensaios sobre macroeconometria bayesiana aplicada." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/12/12138/tde-04042012-201945/.

Abstract:
The three articles that comprise this thesis have in common the use of Bayesian macroeconometric techniques, applied to dynamic stochastic general equilibrium models, to investigate specific problems. In this way, the thesis seeks to fill important gaps in the national and international literatures. In the first article, I estimate the importance of the cost channel of monetary policy using a New Keynesian dynamic stochastic general equilibrium model. To this end, we change the conventional model, assuming that a share of firms must borrow to pay their payroll. An increase in the nominal interest rate therefore raises the effective unit labor cost and may result in an inflation hike. The article analyzes the conditions necessary for the model to exhibit a positive response of inflation to a monetary tightening, a phenomenon that became known as the price puzzle. Because I use the DSGE-VAR methodology, the results can be compared both with the empirical literature that treats the puzzle as an identification problem of VAR models and with the theoretical literature that evaluates the cost channel through New Keynesian models. In the second article, we assess the extent to which inflation expectations generated by a dynamic stochastic general equilibrium model are consistent with the expectations compiled by the Central Bank of Brazil (BCB). This procedure allows us to analyze the rationality of economic agents' expectations in Brazil, comparing them not with observed inflation but with the forecasts of a model built on the hypothesis of rational expectations. In addition, we analyze the impact of using the expectations compiled by the BCB on the estimation of our model, looking at the structural parameters, impulse response functions and variance decompositions. Finally, in the third article of this thesis, I modify the conventional New Keynesian model to include unemployment as proposed by the economist Jordi Galí. With that, I fill an important gap in the national literature, dominated by models that do not contemplate the possibility of labor market disequilibria capable of generating involuntary unemployment. The alternative interpretation of the labor market used here overcomes the identification problems notoriously present in the literature, making the resulting model more robust to the Lucas critique. I use the resulting model to assess, among other things, the determinants of the unemployment rate over the last decade.
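For orientation, a minimal sketch of the cost channel assumed in models of this kind (in the spirit of Ravenna and Walsh, 2006; not necessarily the exact specification of the essay): if a fraction $\alpha$ of firms must pre-pay the wage bill by borrowing at the gross nominal rate $R_t$, real marginal cost becomes

$$
mc_t \;=\; \frac{W_t}{P_t \, MPL_t}\,\bigl[(1-\alpha) + \alpha R_t\bigr],
$$

so a monetary tightening (a higher $R_t$) raises marginal cost directly and can push inflation up on impact, which is the price puzzle the essay studies.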
5

Wang, Jiahui. "Three essays on econometrics." Thesis, 1997. UW restricted. http://hdl.handle.net/1773/7477.

6

Fantinatti, Marcos da Costa. "Modelo de equilíbrio geral estocástico e o mercado de trabalho brasileiro." Universidade de São Paulo, 2016. http://www.teses.usp.br/teses/disponiveis/12/12138/tde-25022016-112933/.

Abstract:
The three articles of this thesis focus on the labor market. The first article calculated the probability of a worker leaving his job and the probability of an unemployed person finding a job in Brazil, using the methodology developed by Shimer (2012). The aim was to determine which of these factors was the most important in explaining unemployment rate fluctuations. The results showed that the probability of an unemployed worker finding a job is more important in explaining the dynamics of the unemployment rate; the literature has commonly found the opposite result. In the second article, we log-linearized and estimated the model built by Christiano, Eichenbaum and Trabandt (2013) for Brazil. This model differs from traditional New Keynesian models because it has a search structure in the labor market. The idea was to compare this model with the traditional one with sticky wages and sticky prices, and to analyze whether this search structure in the labor market can substitute for some of the traditional rigidities in propagating shocks. The impulse response functions to a contractionary monetary policy shock showed that this model explains the dynamics normally found for GDP, inflation and the unemployment rate. Furthermore, the estimation showed that, in general, prices are readjusted less frequently than the frequency estimated by New Keynesian models with sticky wages and sticky prices. However, when the rigidities (capital utilization and the working capital channel) are eliminated, this model did not properly explain the inertial and persistent dynamics of macroeconomic variables such as GDP and inflation. Finally, in the last article, we estimated the Christiano, Eichenbaum and Trabandt (2013) model for the United States, adopting a different estimation strategy: we log-linearized the model and estimated it with Bayesian methods for two different periods, up to 2008 (as in the original article) and up to 2014. The aim was to compare our results with the original model. When the model was estimated with data up to 2008, the estimates were in line with the values found in the literature and, in general, not too far from the values estimated in the original article. However, the estimated parameters pointed to a model in which prices are more rigid, the consumption habit is higher and the monetary rule is less inertial than in the original model; still, the monetary authority reacted much more to inflation than to GDP, as in the original article. When we considered the data up to 2014, the estimated model retained stickier prices and a less inertial monetary rule. Moreover, the more recent data affected the estimated values for the labor market variables more expressively. The impulse response functions reflected this less inertial dynamic of the monetary rule and, overall, followed the expected trajectories.
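The transition probabilities in the first article come from stock-flow identities. A hedged sketch of the discrete-time version of Shimer's (2012) accounting (the thesis may use the continuous-time correction): with $u_t$ the unemployment rate and $u^s_{t+1}$ the share of workers unemployed for less than one period,

$$
F_t = 1 - \frac{u_{t+1} - u^s_{t+1}}{u_t},
\qquad
u_{t+1} = (1 - F_t)\,u_t + s_t\,(1 - u_t),
$$

where the first identity recovers the job-finding probability $F_t$ and the second is solved for the separation probability $s_t$.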
7

Jarocinski, Marek. "Essays on bayesian and classical econometrics with small samples." Doctoral thesis, Universitat Pompeu Fabra, 2006. http://hdl.handle.net/10803/7339.

Abstract:
This thesis deals with the problems of econometric estimation with small samples, in the contexts of monetary VARs and growth empirics. First, it shows how to improve structural VAR analysis on short datasets. The first chapter adapts the exchangeable prior specification to the VAR context and obtains new findings about monetary transmission in the New Member States of the European Union. The second chapter proposes a prior on the initial growth rates of the modeled variables, which tackles the classical small-sample bias in time series and reconciles the Bayesian and classical points of view on time series estimation. The third chapter studies the effect of measurement error in income data on growth empirics, and shows that econometric procedures which are robust to model uncertainty are very sensitive to measurement error of plausible size and properties.
8

Wu, Yue. "Bayesian dynamic covariance models with applications to finance and econometrics." Thesis, University of Cambridge, 2014. https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.708037.

9

Serrano, Fábio Martins. "Impacto regional da política monetária no Brasil: uma abordagem bayesiana." Universidade de São Paulo, 2014. http://www.teses.usp.br/teses/disponiveis/12/12138/tde-16032015-132813/.

Abstract:
The purpose of this dissertation is to (i) estimate the impact of a monetary policy shock on the Brazilian state-level economies and, if they respond asymmetrically, (ii) investigate the causes of this heterogeneity. To this end, Bayesian econometric techniques were used, following Francis et al (2012). These techniques not only overcome the problem of dimensionality inherent to large models, but also provide a formal framework for modeling the uncertainty involved in choosing the appropriate set of covariates. A Bayesian VAR was estimated in order to assess the regional responses. The results indicate that the responses of the Brazilian states to monetary policy innovations are asymmetric. The greatest responses were found in the South and Southeast regions, while the North region seems to be insensitive to an interest rate shock. The Bayesian Model Averaging technique was implemented to assess the determinants of the state-level asymmetries. Despite the large degree of uncertainty about the determinants of the response heterogeneity, states with a greater share of manufacturing jobs tend to be more sensitive to exogenous changes in monetary policy. The results point to the importance of the interest rate channel in determining Brazilian state-level asymmetries.
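As a flavour of the first step, here is a minimal sketch of a conjugate Normal-inverse-Wishart Bayesian VAR(1) with simulated data standing in for the state-level series; the prior tightness, lag length and data are all illustrative, not the dissertation's specification.

```python
import numpy as np
from scipy.stats import invwishart

rng = np.random.default_rng(8)
T, k = 200, 3
# Simulated stand-in for k state-level series (persistent, noisy).
Z = 0.1 * rng.normal(size=(T + 1, k)).cumsum(axis=0) + rng.normal(size=(T + 1, k))
Y, X = Z[1:], np.column_stack([np.ones(T), Z[:-1]])   # VAR(1) with intercept

# Prior: vec(B) | Sigma ~ N(0, Sigma kron Omega0), Sigma ~ IW(S0, nu0).
Omega0, S0, nu0 = 10.0 * np.eye(k + 1), np.eye(k), k + 2

# Standard conjugate updating.
Omega_n = np.linalg.inv(np.linalg.inv(Omega0) + X.T @ X)
B_n = Omega_n @ (X.T @ Y)
S_n = S0 + Y.T @ Y - B_n.T @ np.linalg.inv(Omega_n) @ B_n
nu_n = nu0 + T

forecasts = []
for _ in range(500):
    Sigma = invwishart(df=nu_n, scale=S_n).rvs()
    # Matrix-normal draw of the coefficients given Sigma.
    B = B_n + np.linalg.cholesky(Omega_n) @ rng.normal(size=(k + 1, k)) @ np.linalg.cholesky(Sigma).T
    forecasts.append(X[-1] @ B)       # posterior draw of the one-step forecast
print(np.mean(forecasts, axis=0))
```

Impulse responses to a monetary policy shock would then be computed from each posterior draw of (B, Sigma), e.g. with a Cholesky identification, giving posterior bands for the state-level responses.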
10

Farrell, Patrick John. "Empirical Bayes estimation of small area proportions." Thesis, McGill University, 1991. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=70301.

Abstract:
Due to the nature of survey design, the estimation of parameters associated with small areas is extremely problematic. In this study, techniques for the estimation of small area proportions are proposed and implemented. More specifically, empirical Bayes estimation methodologies, where random effects which reflect the complex structure of a multi-stage sample design are incorporated into logistic regression models, are derived and studied.
The proposed techniques are applied to data from the 1950 United States Census to predict local labor force participation rates of females. Results are compared with those obtained using unbiased and synthetic estimation approaches.
Using the proposed methodologies, a sensitivity analysis concerning the prior distribution assumption, conducted with a view toward outlier detection, is performed. The use of bootstrap techniques to correct measures of uncertainty is also studied.
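A much simpler cousin of these models, useful to see the shrinkage at work, is beta-binomial empirical Bayes for small-area proportions: estimate the prior by maximizing the marginal likelihood, then shrink each area's raw rate toward it. This sketch uses made-up counts and ignores the multi-stage design features the thesis models.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import betaln

# Hypothetical data: successes y out of n sampled units in each small area.
y = np.array([2, 5, 1, 8, 0, 3])
n = np.array([10, 40, 12, 60, 5, 25])

def neg_marglik(params):
    """Negative beta-binomial marginal log-likelihood (up to a constant)."""
    a, b = np.exp(params)                   # keep alpha, beta positive
    return -np.sum(betaln(a + y, b + n - y) - betaln(a, b))

a, b = np.exp(minimize(neg_marglik, x0=[0.0, 0.0]).x)
eb_estimates = (a + y) / (a + b + n)        # shrunken small-area proportions
print(eb_estimates, y / n)                  # compare with raw proportions
```

Areas with small n are pulled hardest toward the overall rate, which is exactly what makes empirical Bayes attractive when direct estimates are unstable.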
11

Wong, Brehnen. "The Bayesian economic agent as a mechanism for asset-price bubbles." 2008. http://0-gateway.proquest.com.innopac.library.unr.edu/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1460791.

12

Falconer, Jean. "Essays in Fiscal Policy." Thesis, University of Oregon, 2018. http://hdl.handle.net/1794/23774.

Abstract:
The subject of this dissertation is fiscal policy in the United States. In recent years the limitations of monetary policy have become more evident, generating greater interest in the use of fiscal policy as a stabilization tool. Despite considerable advances in the fiscal policy literature, many important questions about the effects and implementation of such policy remain unresolved. This motivates the present work, which explores both topics in the chapters that follow. I begin in the second chapter by estimating Federal Reserve responses to changes in taxes and spending. Monetary responses are a critical determinant of fiscal policy effectiveness since central banks have the ability to offset many of the economic changes resulting from fiscal shocks. Using techniques commonly employed in the fiscal multiplier literature, my results indicate a willingness by monetary policymakers to alter policy directly in response to fiscal shocks in a way that either reinforces or counteracts the resulting effects. In the third and fourth chapters I shift my focus to the conduct of fiscal policy. Specifically, I use Bayesian methods to estimate the response of federal discretionary policy to different macroeconomic variables. I allow for uncertainty about various characteristics of the underlying model which enables me to determine, for example, which variables matter to policymakers; whether policy conduct has changed over time; and whether policy responses are state dependent. My results indicate, among other things, that policy responds countercyclically to changes in the labor market, but only during periods of weak economic activity.
13

Prüser, Jan. "Essays in Modeling Fat Time Series Data using Bayesian Econometrics." Supervised by Christoph Hanck. Duisburg, 2019. http://d-nb.info/1191692493/34.

14

Doehr, Rachel M. "Adventures at the Zero Lower Bound: A Bayesian Time-Varying Parameter Vector Autoregressive Analysis of Monetary Policy Uncertainty Shocks." Scholarship @ Claremont, 2016. http://scholarship.claremont.edu/cmc_theses/1318.

Abstract:
Using survey-based measures of future interest rate expectations from the Blue Chip Economic Indicators and the Survey of Professional Forecasters, we examine the relationship between monetary policy uncertainty, captured as the dispersion of interest rate forecasts, and fluctuations in real economic activity and core inflation. We use a flexible time-varying parameter vector autoregression (TVP-VAR) model to clearly isolate the dynamic effects of shocks to monetary policy uncertainty. To further study a possible nonlinear relationship between monetary policy uncertainty and the macroeconomic aggregates, we extract the impulse-response functions (IRFs) estimated at each quarter in the time series, and use a multivariate regression relating various measures of the shape of the IRFs to the level of monetary policy uncertainty at that quarter in the TVP-VAR model, in order to gauge the relationship between the effectiveness of traditional monetary policy (shocks to the Federal Funds rate), forward guidance (shocks to expected interest rates) and uncertainty. The results show that monetary policy uncertainty can have a quantitatively significant impact on output, with a one standard deviation shock to uncertainty associated with a 0.6% rise in unemployment. The indirect effects are more substantial, with a one standard deviation increase in monetary policy uncertainty associated with a 23% decrease in the maximum response of unemployment to a forward guidance episode (an interest rate expectations shock). This evidence points to the importance of managing monetary policy uncertainty (clear and direct forward guidance) as a key policy tool, both in stimulating economic activity and in propagating other monetary policy measures through the macroeconomy.
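The elementary building block of a TVP-VAR is a regression whose coefficients follow random walks, filtered recursively. A minimal univariate sketch (state and measurement variances Q and R are taken as known here; in practice they are estimated, e.g. by MCMC):

```python
import numpy as np

def tvp_filter(y, X, Q, R):
    """Kalman filter for y_t = x_t' beta_t + e_t, beta_t = beta_{t-1} + v_t."""
    T, k = X.shape
    beta = np.zeros(k)
    P = 10.0 * np.eye(k)            # diffuse-ish initial state covariance
    path = np.zeros((T, k))
    for t in range(T):
        P = P + Q                   # predict: random-walk drift in beta
        x = X[t]
        f = x @ P @ x + R           # one-step forecast variance of y_t
        K = P @ x / f               # Kalman gain
        beta = beta + K * (y[t] - x @ beta)
        P = P - np.outer(K, x @ P)
        path[t] = beta
    return path

# Toy example: a slope that drifts from 1 to 2 over the sample.
rng = np.random.default_rng(9)
T = 300
X = np.column_stack([np.ones(T), rng.normal(size=T)])
slope = np.linspace(1.0, 2.0, T)
y = X[:, 0] + slope * X[:, 1] + 0.3 * rng.normal(size=T)
print(tvp_filter(y, X, Q=1e-4 * np.eye(2), R=0.09)[-1])  # roughly [1, 2]
```

In the full TVP-VAR each equation has many such drifting coefficients plus stochastic volatility, which is why Bayesian simulation methods are used.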
15

Li, Yuan. "The new development of econometrics and its applications in financial markets." Diss., 2009. Online access via UMI.

16

Roberts, Danielle M. "The Resource Curse and Economic Freedom: A Bayesian Perspective." Scholarship @ Claremont, 2015. http://scholarship.claremont.edu/cmc_theses/1132.

Abstract:
The literature addressing the resource curse has been extensive. Many studies have put forth theories to explain the curse, but these theories are often refuted by new studies. Recently, there has been a theory that natural resource abundance leads to decreased economic freedom, which in turn causes slower economic growth. Many of these studies have used frequentist testing to arrive at their conclusions. Although frequentist testing is widely used, it has several drawbacks; in particular, there is no way of addressing model uncertainty. Unless a study is able to incorporate every significant explanatory variable, the results will suffer from omitted variable bias. Recently, researchers have been applying Bayesian statistics to address the problem of model uncertainty. In this study, we apply Bayesian Model Averaging (BMA) to build a growth model and to see whether natural resources have a negative effect on growth. We take the implementation of BMA a step further to see if there is an indirect negative effect of natural resources on economic freedom. However, contrary to previous studies, we were not able to find a negative relationship between resource abundance and economic freedom.
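BMA over linear growth regressions is often implemented with the BIC approximation to marginal likelihoods. A self-contained sketch over all regressor subsets (feasible here because k is tiny; real applications sample the model space):

```python
import itertools
import numpy as np

def bic_bma(y, X, names):
    """Posterior inclusion probabilities via BIC-weighted model averaging."""
    T, k = X.shape
    models = []
    for m in range(k + 1):
        for subset in itertools.combinations(range(k), m):
            Z = np.column_stack([np.ones(T)] + [X[:, j] for j in subset])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            resid = y - Z @ beta
            bic = T * np.log(resid @ resid / T) + Z.shape[1] * np.log(T)
            models.append((subset, bic))
    bics = np.array([b for _, b in models])
    w = np.exp(-0.5 * (bics - bics.min()))
    w /= w.sum()                               # posterior model probabilities
    pip = np.zeros(k)
    for (subset, _), wi in zip(models, w):
        for j in subset:
            pip[j] += wi                       # accumulate inclusion weight
    return dict(zip(names, pip))

rng = np.random.default_rng(10)
X = rng.normal(size=(80, 4))
y = 0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(size=80)
print(bic_bma(y, X, ["resources", "investment", "freedom", "openness"]))
```

Variable names here are purely illustrative; the point is that inclusion probabilities, not a single specification, summarize the evidence under model uncertainty.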
17

Ribeiro Ramos, Francisco Fernando. "Essays in time series econometrics and forecasting with applications in marketing." RMIT University. Economics, Finance and Marketing, 2007. http://adt.lib.rmit.edu.au/adt/public/adt-VIT20071220.144516.

Abstract:
This dissertation is composed of two parts, an integrative essay and a set of published papers. The essay and the collection of papers are placed in the context of the development and application of time series econometric models from the 1970s through 2005, with particular focus on the Marketing discipline. The main aim of the integrative essay is modelling the effects of marketing actions on performance variables, such as sales and market share, in competitive markets. Such research required the estimation of two kinds of time series econometric models: multivariate and multiple time series models. I use Autoregressive Integrated Moving Average (ARIMA) intervention models and the Pierce and Haugh statistical test to model the impact of a single marketing instrument, mainly price promotions, to measure own and cross short-term sales effects, and to study asymmetric marketing competition. I develop and apply Vector AutoRegressive (VAR) and Bayesian Vector AutoRegressive (BVAR) models to estimate dynamic relationships in the market and to forecast market share. BVAR models are especially advantageous because they contain all relevant dynamic and interactive effects: they accommodate not only classical competitive reaction effects, but also own and cross market share brand feedback effects and internal decision rules, and they provide substantively useful insights into the dynamics of demand. The integrative essay is structured in four main parts. The introduction sets out the basic ideas behind the published papers, with particular focus on the motivation of the essay, the types of competitive reaction effects analysed, an overview of time series econometric models in marketing, a short discussion of the basic methodology used in the research, and a brief description of the inter-relationships across the published papers and the structure of the essay. The discussion is centred on how to model the effects of marketing actions at the selective demand or brand level and at the primary demand or product level. At the brand level I discuss the research contribution of my work on (i) modelling promotional short-term effects of price and non-price actions on sales and market share for consumer packaged goods, with no competition, (ii) how to measure own and cross short-term sales effects of advertising and price, in particular cross-lead and lag effects, asymmetric sales behaviour and competition without retaliatory actions, in an automobile market, (iii) how to model marketing-mix effectiveness in the short and long term on market shares in a car market, (iv) what is the best method to forecast market share, and (v) the study of causal linkages at different time horizons between sales and marketing activity for a particular brand. At the product or commodity level, I propose a way to model the flows of tourists that come from different origins (countries) to the same country-destination as market segments defining the primary demand of a commodity - the product
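A minimal sketch of the ARIMA intervention idea used at the brand level: simulate sales with an AR(1) disturbance and a step-shaped promotion effect, then recover the effect with a regression-with-ARMA-errors model. The data and the effect size are made up.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(11)
T = 120
promo = (np.arange(T) >= 60).astype(float)     # step intervention at t = 60
e = np.zeros(T)
for t in range(1, T):                          # AR(1) noise
    e[t] = 0.6 * e[t - 1] + rng.normal(0, 1)
sales = 100 + 5 * promo + e                    # true promotion effect = +5

fit = SARIMAX(sales, exog=promo, order=(1, 0, 0), trend="c").fit(disp=False)
print(fit.params)   # the exog coefficient should be close to 5
```

The Pierce and Haugh test mentioned above would instead cross-correlate pre-whitened series to establish the direction of causality between, say, advertising and sales.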
18

Paccagnini, Alessia. "Model validation in the DSGE approach." Doctoral thesis, Università Bocconi, Milano, 2009. http://hdl.handle.net/10281/13792.

Abstract:
The purpose of this thesis is to discuss the introduction and implementation of the idea of model validation, especially in the use of Dynamic Stochastic General Equilibrium (DSGE) models. In this discussion, mixture models are presented as a recent econometric tool used in model validation. Two examples of DSGE models are illustrated in order to introduce two problems: omitted variables within the statistical identification problem, and the finite-order representation of a DSGE model by a Vector Autoregression (VAR). The review concludes with some pointers for future research and for further development of the use of mixture models for model validation.
19

Nunes, André Francisco Nunes de. "Três ensaios sobre intermediação financeira em modelos DSGE aplicados ao Brasil." Biblioteca Digital de Teses e Dissertações da UFRGS, 2015. http://hdl.handle.net/10183/132999.

Abstract:
The present thesis is a collection of three essays on the Bayesian estimation of DSGE models with financial frictions for the Brazilian economy. The first essay investigates how the incorporation of financial intermediaries in a DSGE model influences the analysis of the economic cycle, and how credit policy can be employed to mitigate the effects of credit market shocks on economic activity. The Brazilian government expanded credit in the economy through public financial institutions at the cost of an increase in public debt. We estimated a model inspired by Gertler and Karadi (2011) to evaluate the performance of the Brazilian economy under the influence of a credit policy. Credit policy was effective in mitigating the recessionary effects of a financial crisis that affects the valuation of private assets and the net worth of financial institutions. However, traditional monetary policy was more efficient for the stabilization of inflation in times of normality. The second essay consists of a DSGE-VAR model for the Brazilian economy. The DSGE part is a small open economy with financial frictions, in line with Gertler, Gilchrist and Natalucci (2007). The results indicate that the DSGE-VAR estimation provides a better fit to the data than alternative models. In addition, external shocks have significant impacts on the equity and debt of domestic firms. This result strengthens the evidence that an important channel transmitting movements in the world economy to Brazil runs through the productive sector. The third essay analyzes the transmission of shocks to the banking credit spread to the other variables of the economy and its implications for the conduct of monetary policy in Brazil. We do so by estimating a DSGE model with financial frictions for the Brazilian economy. The model is based on Cúrdia and Woodford (2010), who proposed an extension of the Woodford (2003) model to incorporate a differential between the interest rates available to savers and borrowers, which can vary for both endogenous and exogenous reasons. In this model, monetary policy can respond not only to changes in the inflation rate and the output gap through a simple rule, but also to the credit spread through an adjusted rule. The results show that the inclusion of the credit spread in the New Keynesian model does not significantly change the conclusions of DSGE models in response to traditional exogenous shocks, such as shocks to the interest rate, to the productivity of the economy and to public spending. However, for events that cause a deterioration of financial intermediation through exogenous shocks to the credit spread, the impact on the business cycle was significant, and a monetary policy rule adjusted for the spread can stabilize the economy faster than a traditional rule.
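For concreteness, one illustrative form of the spread-adjusted rule discussed in the third essay (coefficients and notation are ours, not necessarily the thesis's):

$$
i_t \;=\; \rho\, i_{t-1} \;+\; (1-\rho)\bigl(\phi_\pi \pi_t + \phi_y \hat{y}_t\bigr) \;-\; \phi_\omega\, \hat{\omega}_t,
$$

where $\hat{\omega}_t$ is the credit spread in deviation from its steady state; with $\phi_\omega > 0$ the policy rate is cut when intermediation deteriorates, which is the stabilizing mechanism compared above against the standard rule ($\phi_\omega = 0$).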
20

Ndoye, Abdoul Aziz Junior. "Essays on the econometrics of inequality and poverty measurements." Thesis, Aix-Marseille, 2013. http://www.theses.fr/2013AIXM1125.

Abstract:
This dissertation consists of four essays on the econometrics of inequality and poverty measurement. It provides a statistical analysis based on probabilistic models, finite mixture distributions and quantile regression models, all within a Bayesian approach. Chapter 2 models the income distribution using a mixture of lognormal densities. Using the analytical expression of inequality indices, it shows how a Rao-Blackwellised Gibbs sampler can lead to accurate inference on income inequality measurements even in small samples. Chapter 3 develops Bayesian inference for the unconditional quantile regression model based on the Re-centered Influence Function (RIF). It models the considered distribution by a mixture of lognormal densities and then provides conditional posterior densities for the quantile regression parameters. This approach is intended to provide better estimates in the extreme quantiles in the presence of heavy tails, as well as valid small-sample confidence intervals for the Oaxaca-Blinder decomposition. Chapter 4 provides Bayesian inference for a mixture of two Pareto distributions, which is then used to approximate the upper tail of a wage distribution. This mixture model is applied to data from the CPS ORG to analyze the recent structure of top wages in the U.S. from 1992 through 2009. The findings are largely in accordance with explanations combining the model of superstars and the model of tournaments in hierarchical organization structures. Chapter 5 makes use of the RIF-regression to measure both changes in the return to education across quantiles and the rural-urban decomposition of inequality in consumption expenditure in Senegal.
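As a flavour of Chapter 2's approach, here is a hedged sketch of posterior inference on the Gini index under a two-component lognormal mixture: at each (stand-in) posterior draw of the mixture parameters, incomes are simulated and the index computed, yielding a posterior distribution for the Gini. The draws below are fixed placeholders for output from a Gibbs sampler.

```python
import numpy as np

rng = np.random.default_rng(12)

def gini(x):
    """Sample Gini coefficient from the sorted-value formula."""
    x = np.sort(x)
    n = x.size
    return (2 * np.arange(1, n + 1) - n - 1) @ x / (n * x.sum())

# Placeholder posterior draws of (weights, mu, sigma) for two components.
draws = [(np.array([0.6, 0.4]),
          np.array([0.0, 1.0]) + 0.05 * rng.normal(size=2),
          np.array([0.5, 0.3])) for _ in range(200)]

g = []
for w, mu, sig in draws:
    comp = rng.choice(2, size=5000, p=w)          # mixture component labels
    income = rng.lognormal(mu[comp], sig[comp])   # simulated incomes
    g.append(gini(income))
print(np.mean(g), np.percentile(g, [2.5, 97.5]))  # posterior mean and 95% band
```

The Rao-Blackwellisation in the chapter replaces this brute-force simulation with analytical expressions for the indices given the parameters, which reduces Monte Carlo noise.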
21

Nguyen, Trong Nghia. "Deep Learning Based Statistical Models for Business and Financial Data." Thesis, The University of Sydney, 2021. https://hdl.handle.net/2123/26944.

Abstract:
We investigate a wide range of statistical models commonly used in business and financial econometrics applications and propose flexible ways to combine these highly interpretable models with powerful predictive models from the deep learning literature, in order to leverage the advantages and compensate for the disadvantages of each modelling approach. Our approaches to utilizing deep learning techniques for financial data differ from the recently proposed deep learning-based models in the financial econometrics literature in several respects. First, we do not overlook well-established structures that have been successfully used in statistical modelling. We flexibly incorporate deep learning techniques into the statistical models to capture data effects that cannot be explained by their simple linear components. Our proposed modelling frameworks therefore normally include two components: a linear part to explain linear dependencies, and a deep learning-based part to capture effects beyond the linearity possibly exhibited in the underlying process. Second, we do not use neural network structures in the same fashion as they are implemented in the deep learning literature, but modify those black-box methods to make them more explainable and hence improve the interpretability of the proposed models. As a result, our hybrid models not only perform better than pure deep learning techniques in terms of interpretation, but also often produce more accurate out-of-sample forecasts than the counterpart statistical frameworks. Third, we propose advanced Bayesian inference methodologies to efficiently quantify the uncertainty about model estimation and prediction. For the proposed high-dimensional deep learning-based models, performing efficient Bayesian inference is extremely challenging and is often ignored in engineering-oriented papers, which generally prefer frequentist estimation approaches, mainly due to their simplicity.
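A crude two-stage stand-in for the linear-plus-deep idea (the thesis estimates both parts jointly with Bayesian methods; this sketch only conveys the flavour): fit the interpretable linear component first, then let a small neural network model what the linear part misses.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(13)
X = rng.normal(size=(500, 3))
# Linear signal in x0, x1 plus a nonlinearity in x2 that a line cannot catch.
y = 1.5 * X[:, 0] - 1.0 * X[:, 1] + np.sin(3 * X[:, 2]) + 0.1 * rng.normal(size=500)

linear = LinearRegression().fit(X, y)          # interpretable component
residual = y - linear.predict(X)
net = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000,
                   random_state=0).fit(X, residual)

y_hat = linear.predict(X) + net.predict(X)     # hybrid in-sample prediction
print(linear.coef_)                            # still readable: ~ [1.5, -1.0, 0]
```

The linear coefficients stay interpretable while the network absorbs the remaining structure, which is the trade-off the thesis formalizes.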
22

Salabasis, Mickael. "Bayesian time series and panel models : unit roots, dynamics and random effects." Doctoral thesis, Stockholm : Economic Research Institute, Stockholm School of Economics (Ekonomiska forskningsinstitutet vid Handelshögsk.) (EFI), 2004. http://www.hhs.se/efi/summary/632.htm.

23

Rocio, Vitor Dias. "Um modelo espaço-temporal contínuo para o preço de lançamentos imobiliários na cidade de São Paulo." Universidade de São Paulo, 2018. http://www.teses.usp.br/teses/disponiveis/96/96131/tde-03082018-105129/.

Abstract:
This work develops a continuous space-time model for real estate prices in the city of São Paulo, estimated using Bayesian methods. We decompose the series into trend and cycle, and incorporate a set of explanatory variables and random spatial effects projected into the continuum. The model introduces a new method to analyze the price formation of real estate launches. Our hedonic model considers not only the intrinsic characteristics of a property, but also the characteristics of the neighborhood and the economic environment. With this model, we are able to observe equilibrium prices for the respective locations and obtain a clearer interpretation of the dynamics of real estate prices in São Paulo between January 2000 and December 2013.
24

Paccagnini, Alessia. "Model validation in the DSGE approach." Doctoral thesis, Università Bocconi, 2009. https://hdl.handle.net/11565/4053466.

Abstract:
The purpose of this thesis is to discuss the introduction and implementation of the idea of model validation, especially in the use of Dynamic Stochastic General Equilibrium (DSGE) models. In this discussion, mixture models are presented as a recent econometric tool used in model validation. Two examples of DSGE models are illustrated in order to introduce two problems: omitted variables within the statistical identification problem, and the finite-order representation of a DSGE model by a Vector Autoregression (VAR). The review concludes with some pointers for future research and for further development of the use of mixture models for model validation.
25

Parmler, Johan. "Essays in empirical asset pricing." Doctoral thesis, Stockholm : Economic Research Institute (EFI), Stockholm School of Economics, 2005. http://www.hhs.se/efi/summary/691.htm.

26

Silvestrini, Andrea. "Essays on aggregation and cointegration of econometric models." Doctoral thesis, Universite Libre de Bruxelles, 2009. http://hdl.handle.net/2013/ULB-DIPOT:oai:dipot.ulb.ac.be:2013/210304.

Abstract:
This dissertation can be broadly divided into two independent parts. The first three chapters analyse issues related to temporal and contemporaneous aggregation of econometric models. The fourth chapter contains an application of Bayesian techniques to investigate whether the post-transition fiscal policy of Poland is sustainable in the long run and consistent with an intertemporal budget constraint.

Chapter 1 surveys the econometric methodology of temporal aggregation for a wide range of univariate and multivariate time series models.

A unified overview of temporal aggregation techniques for this broad class of processes is presented in the first part of the chapter and the main results are summarized. In each case, assuming to know the underlying process at the disaggregate frequency, the aim is to find the appropriate model for the aggregated data. Additional topics concerning temporal aggregation of ARIMA-GARCH models (see Drost and Nijman, 1993) are discussed and several examples presented. Systematic sampling schemes are also reviewed.

Multivariate models, which show interesting features under temporal aggregation (Breitung and Swanson, 2002, Marcellino, 1999, Hafner, 2008), are examined in the second part of the chapter. In particular, the focus is on temporal aggregation of VARMA models and on the related concept of spurious instantaneous causality, which is not a time series property invariant to temporal aggregation. On the other hand, as pointed out by Marcellino (1999), other important time series features such as cointegration and the presence of unit roots are invariant to temporal aggregation and are not induced by it.

Some empirical applications based on macroeconomic and financial data illustrate all the techniques surveyed and the main results.

Chapter 2 is an attempt to monitor fiscal variables in the Euro area, building an early warning signal indicator for assessing the development of public finances in the short run and exploiting the existence of monthly budgetary statistics from France, taken as an "example country".

The application is conducted focusing on the cash State deficit, looking at components from the revenue and expenditure sides. For each component, monthly ARIMA models are estimated and then temporally aggregated to the annual frequency, as the policy makers are interested in yearly predictions.

The short-run forecasting exercises carried out for years 2002, 2003 and 2004 highlight the fact that the one-step-ahead predictions based on the temporally aggregated models generally outperform those delivered by standard monthly ARIMA modeling, as well as the official forecasts made available by the French government, for each of the eleven components and thus for the whole State deficit. More importantly, by the middle of the year, very accurate predictions for the current year are made available.

The proposed method could be extremely useful, providing policy makers with a valuable indicator when assessing the development of public finances in the short-run (one year horizon or even less).
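A hedged sketch of the Chapter 2 mechanics: estimate a monthly ARIMA model for one budget component and aggregate its forecasts to the annual frequency (a flow variable, so annual = sum of months). The data and model order are invented; the chapter works with the temporally aggregated models themselves, while this sketch shows only the simplest forecast-aggregation route.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(14)
# Fifteen years of a monthly budget component with drift (hypothetical).
monthly = 10 + np.cumsum(rng.normal(0.05, 1.0, 180))

fit = ARIMA(monthly, order=(1, 1, 0)).fit()
f = fit.forecast(steps=12)        # twelve one-month-ahead predictions
annual_forecast = f.sum()         # temporal aggregation to the annual total
print(annual_forecast)
```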

Chapter 3 deals with the issue of forecasting contemporaneous time series aggregates. The performance of "aggregate" and "disaggregate" predictors in forecasting contemporaneously aggregated vector ARMA (VARMA) processes is compared. An aggregate predictor is built by forecasting directly the aggregate process, as it results from contemporaneous aggregation of the data generating vector process. A disaggregate predictor is a predictor obtained from aggregation of univariate forecasts for the individual components of the data generating vector process.

The econometric framework is broadly based on Lütkepohl (1987). The necessary and sufficient condition for the equality of mean squared errors associated with the two competing methods in the bivariate VMA(1) case is provided. It is argued that the condition of equality of predictors as stated in Lütkepohl (1987), although necessary and sufficient for the equality of the predictors, is sufficient (but not necessary) for the equality of mean squared errors.

Furthermore, it is shown that the same forecasting accuracy for the two predictors can be achieved using specific assumptions on the parameters of the VMA(1) structure.

Finally, an empirical application that involves the problem of forecasting the Italian monetary aggregate M1 on the basis of annual time series ranging from 1948 until 1998, prior to the creation of the European Economic and Monetary Union (EMU), is presented to show the relevance of the topic. In the empirical application, the framework is further generalized to deal with heteroskedastic and cross-correlated innovations.
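The Chapter 3 comparison can be mimicked in a few lines: simulate a bivariate VMA(1), then forecast the aggregate either directly ("aggregate predictor") or by summing univariate forecasts of the components ("disaggregate predictor"). Parameters are illustrative; contemporaneous aggregation of a VMA(1) yields at most an MA(1), which motivates the fitted orders.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(15)
T = 400
Theta = np.array([[0.5, 0.2],
                  [0.0, 0.3]])
e = rng.normal(size=(T + 1, 2))
Y = e[1:] + e[:-1] @ Theta.T          # VMA(1): y_t = e_t + Theta e_{t-1}
z = Y.sum(axis=1)                     # contemporaneous aggregate

# Aggregate predictor: model z_t directly as MA(1).
agg = ARIMA(z[:-1], order=(0, 0, 1)).fit().forecast(1)[0]
# Disaggregate predictor: sum univariate MA(1) forecasts of the components.
dis = sum(ARIMA(Y[:-1, i], order=(0, 0, 1)).fit().forecast(1)[0]
          for i in range(2))
print(agg, dis, z[-1])                # compare against the realized value
```

Repeating this over many replications and comparing mean squared errors reproduces, in miniature, the question the chapter answers analytically for the bivariate VMA(1) case.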

Chapter 4 deals with a cointegration analysis applied to the empirical investigation of fiscal sustainability. The focus is on a particular country: Poland. The choice of Poland is not random. First, the motivation stems from the fact that fiscal sustainability is a central topic for most of the economies of Eastern Europe. Second, this is one of the first countries to start the transition process to a market economy (since 1989), providing a relatively favorable institutional setting within which to study fiscal sustainability (see Green, Holmes and Kowalski, 2001). The emphasis is on the feasibility of a permanent deficit in the long-run, meaning whether a government can continue to operate under its current fiscal policy indefinitely.

The empirical analysis to examine debt stabilization is made up by two steps.

First, a Bayesian methodology is applied to conduct inference about the cointegrating relationship between budget revenues and (inclusive of interest) expenditures and to select the cointegrating rank. This task is complicated by the conceptual difficulty linked to the choice of the prior distributions for the parameters relevant to the economic problem under study (Villani, 2005).

Second, Bayesian inference is applied to the estimation of the normalized cointegrating vector between budget revenues and expenditures. With a single cointegrating equation, some known results concerning the posterior density of the cointegrating vector may be used (see Bauwens, Lubrano and Richard, 1999).

The priors used in the chapter lead to straightforward posterior calculations which can be easily performed.

Moreover, the posterior analysis leads to a careful assessment of the magnitude of the cointegrating vector. Finally, it is shown to what extent the likelihood of the data is important in revising the available prior information, relying on numerical integration techniques based on deterministic methods.


Doctorate in Economics and Management Sciences

27

Welz, Peter. "Quantitative New Keynesian Macroeconomics and Monetary Policy." Doctoral thesis, Uppsala : Department of Economics, Uppsala University, 2005. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-5978.

28

Afghari, Amir Pooyan. "Detecting motor vehicle crash blackspots based on their underlying behavioural, engineering, and spatial causes." Thesis, University of Queensland, 2019. https://eprints.qut.edu.au/127653/1/127653.pdf.

Abstract:
The state of the practice in crash blackspot identification (BSI) has largely been driven by empirical research without much explicit attention paid to the underlying theoretical assumptions. These embedded assumptions have shaped the science of blackspot identification methodologies and developments over time. Despite the fairly extensive methodological enhancements made during the past five decades, little attention has been paid to reviewing, questioning and possibly revising these underlying theoretical assumptions. The theoretical assumptions underlying blackspot identification include: 1) crash risk can be adequately captured by the total number of crashes at a site divided by the amount of sites’ exposure to potential crashes, 2) designed roads pose differential crash risk to motorists arising from observed (operational) features of the transport network, and 3) crashes are the outcomes of a single source of risk at a site. This doctoral dissertation first reviews the theoretical assumptions underlying blackspot identification, raises fundamental questions about these theoretical assumptions and presents the associated gaps in the blackspot identification literature. These gaps include: 1) non-operational crash contributing factors and their unobserved effects have not been explicitly incorporated into the BSI, and 2) crashes may not be the outcomes of a single source of risk, but rather may be the outcomes of multiple sources of risk at a site. This focus on the underlying theory evolution, its influence on empirical work, and its reflection on remaining theory gaps serves as one of the unique contributions of this research to the literature. A more accurate underlying mechanism for explaining motor vehicle crash causation is then hypothesized as a potential solution to address the research gaps. Stated succinctly, the current theoretical assumption underlying BSI is that crashes are well-approximated by a single source of risk, wherein several contributing factors exert their collective, non-independent influences on the occurrence of crashes via a linear predictor. This PhD study first postulates, and then demonstrates empirically, that crash occurrence may be more complex than can be adequately captured by a single source of risk. It is hypothesized that the total observed crash count at a transport network location is generated by multiple underlying, simultaneous and inter-dependent sources of risk, rather than one. Each of these sources may uniquely contribute to the total observed crash count. For instance, a site’s crash occurrence may be dominated by contributions from driver behaviour issues (e.g. speeding, impaired driving), while another site’s crashes might arise predominately from design and operational deficiencies such as deteriorating pavements and worn lane markings. A multiple risk source methodology is developed to correspond with and empirically test this hypothesis. Two modelling approaches are then used to show the applicability of the multiple risk source methodology: 1) Bayesian latent mixture model, and 2) joint econometric model with random parameters and instrumental variables. Finally, the severity of crashes is explicitly incorporated into the multiple risk source methodology by extending the multiple risk source model to a joint model of crash count and crash severity. 
To test the viability of the methodological framework, all models are applied to a comprehensive dataset for the state-controlled roads in Queensland, Australia, and the results are compared with the traditional approaches. The results show that the new multiple risk source models outperform the traditional single risk source models in terms of prediction performance and goodness of fit. In addition, the multiple risk source models are able to provide more insight into crash contributing factors, their impact on the total crash count, and their impact on the crash count proportions generated by each risk source. It is found that the parameters of the joint model of crash count and crash severity are moderated by the correlation between these two models and, therefore, the total risk at a site can be adequately captured by considering crash count and severity simultaneously. Overall, the findings of this research indicate that decomposing the total crash count into its constituent components, separating the risk sources and incorporating crash severity into the overall framework leads to efficient, cost-effective identification of crash blackspots.
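To fix ideas on the multiple-risk-source hypothesis, the sketch below writes the total crash count at a site as the sum of two latent Poisson sources, each driven by its own covariates, and recovers the parameters by maximum likelihood, exploiting the fact that a sum of independent Poisson counts is Poisson with summed rates. All variable names and data are invented for illustration; this is not the thesis's Bayesian latent mixture or joint econometric model.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import poisson

# Hypothetical data: y = total crash counts; X1, X2 = covariates tied to a
# "behavioural" and an "engineering" risk source (names are illustrative).
rng = np.random.default_rng(0)
n = 500
X1 = np.column_stack([np.ones(n), rng.normal(size=n)])
X2 = np.column_stack([np.ones(n), rng.normal(size=n)])
b1_true, b2_true = np.array([0.2, 0.5]), np.array([-0.3, 0.4])
y = rng.poisson(np.exp(X1 @ b1_true) + np.exp(X2 @ b2_true))

def neg_loglik(theta):
    b1, b2 = theta[:2], theta[2:]
    # A sum of independent Poisson sources is Poisson with summed rates.
    lam = np.exp(X1 @ b1) + np.exp(X2 @ b2)
    return -poisson.logpmf(y, lam).sum()

res = minimize(neg_loglik, np.zeros(4), method="BFGS")
b1_hat, b2_hat = res.x[:2], res.x[2:]
# Expected share of each site's crashes attributable to the first source:
share1 = np.exp(X1 @ b1_hat) / (np.exp(X1 @ b1_hat) + np.exp(X2 @ b2_hat))
```

The estimated per-site shares illustrate the decomposition idea the abstract describes: two sites with the same total count can have very different source compositions, and hence call for different treatments.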
APA, Harvard, Vancouver, ISO, and other styles
29

Wang, Chao. "The relationship between traffic congestion and road accidents : an econometric approach using GIS." Thesis, Loughborough University, 2010. https://dspace.lboro.ac.uk/2134/6207.

Full text
Abstract:
Both traffic congestion and road accidents impose a burden on society, and it is therefore important for transport policy makers to reduce their impact. An ideal scenario would be for traffic congestion and accidents to be reduced simultaneously; however, this may not be possible, since it has been speculated that increased traffic congestion may be beneficial in terms of road safety. This is based on the premise that there would be fewer fatal accidents, and that the accidents that occurred would tend to be less severe, due to the low average speed when congestion is present. If this is confirmed, it poses a potential dilemma for transport policy makers: the benefit of reducing congestion might be offset by more severe accidents. It is therefore important to fully understand the relationship between traffic congestion and road accidents while controlling for other factors affecting road traffic accidents. The relationship between traffic congestion and road accidents appears to be an under-researched area. Previous studies often lack a suitable congestion measurement and an appropriate econometric model using real-world data. This thesis aims to explore the relationship between traffic congestion and road accidents using an econometric and GIS approach. The analysis is based on data from the M25 motorway and its surrounding major roads for the period 2003-2007. A series of econometric models have been employed to investigate the effect of traffic congestion on both accident frequency (such as classical Negative Binomial and Bayesian spatial models) and accident severity (such as ordered logit and mixed logit models). The Bayesian spatial model and the mixed logit model are the best models estimated for the accident frequency and accident severity analyses respectively. The model estimation results suggest that traffic congestion is positively associated with the frequency of fatal and serious injury accidents and negatively (i.e. inversely) associated with the severity of the accidents that occur. Traffic congestion is found to have little impact on the frequency of slight injury accidents. Other contributing factors have also been controlled for and produce results consistent with previous studies. It is concluded that traffic congestion overall has a negative impact on road safety. This may be partially due to higher speed variance among vehicles within and between lanes, and erratic driving behaviour in the presence of congestion. The results indicate that mobility and safety can be improved simultaneously, and that there is therefore a significant additional benefit of reducing traffic congestion in terms of road safety. Several policy implications have been identified in order to optimise traffic flow and improve driving behaviour, which would be beneficial to both congestion and accident reduction. These include: reinforcing electronic warning signs and Active Traffic Management, enforcing average speeds on stretches of roadway, and introducing minimum speed limits in the UK. This thesis contributes to knowledge on the relationship between traffic congestion and road accidents, showing that mobility and safety can be improved simultaneously. A new hypothesis is proposed: that traffic congestion on major roads may increase the occurrence of serious injury accidents.
This thesis also proposes a new map-matching technique to assign accidents to the correct road segments, and shows how a two-stage modelling process combining accident frequency and severity models can be used in site ranking, with the objective of identifying hazardous accident hotspots for further safety examination and treatment.
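As a rough illustration of the classical accident-frequency baseline mentioned in this abstract, the sketch below fits a Negative Binomial (NB2) count model with a congestion index and an exposure term. All variable names and data are invented; the thesis's Bayesian spatial and mixed logit models are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical link-level data: a congestion index and log traffic flow
# (exposure) explaining yearly counts of fatal-and-serious-injury accidents.
rng = np.random.default_rng(1)
n = 400
congestion = rng.uniform(0.0, 1.0, n)
log_flow = rng.normal(10.0, 0.5, n)
X = sm.add_constant(np.column_stack([congestion, log_flow]))
y = rng.poisson(np.exp(-8.0 + 0.6 * congestion + 0.8 * log_flow))  # toy counts

# NB2 specification: E[y] = exp(X beta), Var[y] = mu + alpha * mu^2
nb2 = sm.NegativeBinomial(y, X).fit(disp=0)
print(nb2.summary())
```

A positive estimated coefficient on the congestion index would correspond to the thesis's finding that congestion raises the frequency of fatal and serious injury accidents.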
APA, Harvard, Vancouver, ISO, and other styles
30

Ormeño, Sánchez Arturo. "Essays on Inflation Expectations, Heterogeneous Agents, and the Use of Approximated Solutions in the Estimation of DSGE models." Doctoral thesis, Universitat Pompeu Fabra, 2011. http://hdl.handle.net/10803/51247.

Full text
Abstract:
In this thesis I evaluate departures from three common assumptions in macroeconomic modeling and estimation, namely the Rational Expectations (RE) hypothesis, the representative agent assumption, and the use of first-order approximations in the estimation of dynamic stochastic general equilibrium (DSGE) models. In the first chapter I determine how the use of survey data on inflation expectations in the estimation of a model alters the evaluation of the RE assumption in comparison to an alternative assumption, namely learning. In the second chapter, I use heterogeneous agent models to determine the relationship between income volatility and the demand for durable goods. In the third chapter I evaluate whether the use of first-order approximations in the estimation of a model could affect the evaluation of the determinants of the Great Moderation.
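For readers unfamiliar with the learning alternative evaluated in the first chapter, the snippet below sketches constant-gain belief updating, a standard departure from rational expectations in this literature. The gain value and the inflation process are illustrative only, not the chapter's specification.

```python
import numpy as np

# Constant-gain learning: agents revise their inflation belief a_t by a
# fixed fraction g of the latest forecast error, instead of holding
# model-consistent (rational) expectations.
rng = np.random.default_rng(2)
g = 0.02                # constant gain (assumed value for the sketch)
a = 2.0                 # initial belief about mean inflation
beliefs = []
for t in range(200):
    pi = 2.0 + 0.8 * rng.normal()   # placeholder realized inflation
    a += g * (pi - a)               # recursive update from the forecast error
    beliefs.append(a)
```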
APA, Harvard, Vancouver, ISO, and other styles
31

Lau, Wai Kwong. "Bayesian nonparametric methods for some econometric problems /." View abstract or full-text, 2005. http://library.ust.hk/cgi/db/thesis.pl?ISMT%202005%20LAU.

Full text
APA, Harvard, Vancouver, ISO, and other styles
32

Wu, Ruochen. "Essays on semi-parametric Bayesian econometric methods." Thesis, University of Cambridge, 2019. https://www.repository.cam.ac.uk/handle/1810/288745.

Full text
Abstract:
This dissertation consists of three chapters on semi-parametric Bayesian econometric methods. Chapter 1 applies a semi-parametric method to demand systems, comparing the abilities of different approaches to linearly estimating the widely used Almost Ideal demand model, by either iteration or approximation, to recover the true elasticities. Chapter 2, co-authored with Dr. Melvyn Weeks, introduces a new semi-parametric Bayesian Generalized Least Squares estimator, which employs the Dirichlet Process prior to cope with potential heterogeneity in the error distributions. Two methods are discussed as special cases of the GLS estimator: the Seemingly Unrelated Regression for equation systems, and the Random Effects Model for panel data, which can be applied to many fields such as the demand analysis in Chapter 1. Chapter 3 focuses on subset selection for the efficiencies of firms, addressing the influence of heterogeneity in the distributions of efficiencies on subset selection by applying the semi-parametric Bayesian Random Effects Model introduced in Chapter 2.
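The Dirichlet Process prior mentioned in Chapter 2 is often represented through its stick-breaking construction; a minimal sketch follows, with the concentration parameter, truncation level, and base measure all chosen arbitrarily for illustration.

```python
import numpy as np

# Truncated stick-breaking draw from a DP(alpha, G0) prior, the device used
# to model heterogeneous error distributions nonparametrically.
rng = np.random.default_rng(3)
alpha, K = 1.0, 50                          # concentration; truncation level
v = rng.beta(1.0, alpha, size=K)
w = v * np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))  # mixture weights
atoms = rng.normal(0.0, 2.0, size=K)        # atoms drawn from G0 = N(0, 4)
w /= w.sum()                                # renormalize after truncation
eps = rng.choice(atoms, size=1000, p=w)     # draws from the random distribution
```

Each draw of (w, atoms) is itself a random discrete distribution, which is what allows the error law to differ across groups of observations rather than being fixed in advance.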
APA, Harvard, Vancouver, ISO, and other styles
33

Wu, Jingtao. "Three Bayesian econometric studies on forecast evaluation." [Ames, Iowa : Iowa State University], 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
34

Norets, Andriy. "Bayesian inference in dynamic discrete choice models." Diss., University of Iowa, 2007. http://ir.uiowa.edu/etd/148.

Full text
APA, Harvard, Vancouver, ISO, and other styles
35

Pole, A. M. "Bayesian analysis of some threshold switching models." Thesis, University of Nottingham, 1985. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.356040.

Full text
APA, Harvard, Vancouver, ISO, and other styles
36

Fonseca, Marcelo Gonçalves da Silva. "Essays on the credit channel of monetary policy: a case study for Brazil." reponame:Repositório Institucional do FGV, 2014. http://hdl.handle.net/10438/11748.

Full text
Abstract:
The outbreak of the subprime crisis in the US in 2008 and of the European sovereign debt crisis in 2010 renewed academic interest in the role played by credit activity in business cycles. The purpose of this work is to present empirical evidence on the credit channel of monetary policy for the Brazilian case, using distinct econometric techniques. The work consists of three essays. The first presents a review of the literature on financial frictions, with special emphasis on its implications for the conduct of monetary policy. It highlights the broad set of unconventional measures used by the central banks of emerging and developed countries in response to the disruption of financial intermediation. A particular chapter is devoted to the challenges faced by emerging-market central banks in conducting monetary policy in an environment of highly integrated capital markets. The second essay presents an empirical investigation of the implications of the credit channel through the lens of a structural FAVAR (SFAVAR) model. The term "structural" derives from the estimation strategy adopted, which makes it possible to attach a clear economic interpretation to the estimated factors. The results show that shocks to the proxies for the external finance premium and the volume of credit produce large and persistent fluctuations in inflation and economic activity, accounting for more than 30% of the variance decomposition of the latter at a three-year horizon. Counterfactual simulations demonstrate that the credit channel amplified the economic contraction in Brazil during the acute phase of the global financial crisis in the last quarter of 2008, and subsequently produced a relevant impulse in the recovery that followed. The third essay presents a Bayesian estimation of a New Keynesian DSGE model that incorporates the financial accelerator mechanism developed by Bernanke, Gertler and Gilchrist (1999). The results provide evidence in line with those of the previous essay: innovations in the external finance premium, represented by credit spreads, produce relevant effects on the dynamics of aggregate demand and inflation. In addition, monetary policy shocks are found to be amplified by the financial accelerator. Keywords: Macroeconomics, Monetary Policy, Credit Channel, Financial Accelerator, FAVAR, DSGE, Bayesian Econometrics
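As a hint at the mechanics of the SFAVAR in the second essay, the sketch below shows the standard first step of any FAVAR: extracting principal-component factors from a large standardized macroeconomic panel. The panel here is simulated noise and the number of factors is arbitrary; the structural identification step that gives the model its name is not attempted.

```python
import numpy as np

# FAVAR first stage: principal components of a standardized macro panel serve
# as estimated factors, later stacked with observed policy variables in a VAR.
rng = np.random.default_rng(4)
T, N, k = 200, 60, 3                       # periods, series, number of factors
X = rng.normal(size=(T, N))                # placeholder macro panel
Xs = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each series
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
F = np.sqrt(T) * U[:, :k]                  # estimated factors (PC normalization)
loadings = (F.T @ Xs) / T                  # corresponding factor loadings
```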
APA, Harvard, Vancouver, ISO, and other styles
37

Lecumberry, Julien. "Transmission des chocs spéculatifs et effets asymétriques." Thesis, Rennes 1, 2014. http://www.theses.fr/2014REN1G011/document.

Full text
Abstract:
In the fall of 2008, the bankruptcy of Lehman Brothers plunged developed economies into a severe crisis that spread rapidly across the world. Also known as the "Great Recession", this episode has since contributed to reviving apprehension about financial imbalances. In this context, we attempt to analyze the macroeconomic effects of the non-fundamental component of stock prices. Overall, the thesis focuses on two questions. First, we investigate the macroeconomic effects of this component, paying particular attention to asymmetry. Second, we examine whether share price misalignments contain leading information about gross domestic product (GDP). In order to address these issues, we first have to define the non-fundamental component of stock prices. Using recent econometric methodologies, we show explicitly that the speculative component has significant effects on the real economy. Furthermore, the impact of a negative shock is larger than that of a positive shock. The volatility of stock prices is found to be an explanation for this asymmetry. Our results also suggest that the speculative component is useful for predicting GDP.
APA, Harvard, Vancouver, ISO, and other styles
38

Petteno', Michele <1985&gt. "Italian Electricity Prices: a Bayesian Approach." Master's Degree Thesis, Università Ca' Foscari Venezia, 2012. http://hdl.handle.net/10579/2193.

Full text
Abstract:
The Italian Power System is characterized by the presence of six electricity zones. The aim of the thesis is to evaluate zonal interdependence from both a cross-sectional and a time-variation point of view. Given the high number of parameters to be examined, a Bayesian approach is used as the computing tool, which is useful for avoiding the problem of over-parameterization. The model is applied to equilibrium electricity spot prices of the Italian Wholesale Market; lagged prices and zonal temperatures are included as independent variables. Fourier spectral analysis is another important tool, used to handle the presence of seasonality.
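The Fourier step described here is commonly implemented by regressing prices on sine/cosine pairs at the seasonal frequencies; the toy sketch below deseasonalizes a simulated daily price series with a weekly and an annual cycle. The frequencies and data are illustrative, not the thesis's.

```python
import numpy as np

# Deseasonalize a daily electricity price series with Fourier terms at the
# weekly and annual frequencies.
rng = np.random.default_rng(5)
T = 730
t = np.arange(T)
price = 50 + 10 * np.sin(2 * np.pi * t / 7) + rng.normal(0, 2, T)  # toy series
Z = np.column_stack([
    np.ones(T),
    np.sin(2 * np.pi * t / 7), np.cos(2 * np.pi * t / 7),        # weekly cycle
    np.sin(2 * np.pi * t / 365), np.cos(2 * np.pi * t / 365),    # annual cycle
])
coef, *_ = np.linalg.lstsq(Z, price, rcond=None)
deseasonalized = price - Z @ coef + coef[0]   # remove cycles, keep the level
```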
APA, Harvard, Vancouver, ISO, and other styles
39

Hauer, Mariana. "Os modelos VAR e VEC espaciais : uma abordagem bayesiana." reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2007. http://hdl.handle.net/10183/12585.

Full text
Abstract:
The main goal of this work is to present the Vector Autoregressive (VAR) model and one of its variations, the Vector Error Correction (VEC) model, under a Bayesian approach, considering regional components that are inserted into the models through prior information which takes into consideration the location of the data. To form such prior information, concepts from spatial econometrics are used, for example contiguity relations and the implications these bring to the modeling. As an illustrative example, the model in question is applied to a regional data set collected for Brazilian states. This data set consists of observations on industrial production for eight states over the period from January 1991 to September 2006. The central question is to uncover whether the incorporation of this prior information in the Bayesian VEC model is coherent when we use models that consider contiguity information.
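The contiguity relations used to build such priors are usually encoded in a row-normalized spatial weight matrix; below is a minimal sketch for a small illustrative subset of Brazilian states. The neighbour lists are only approximate and the subset is smaller than the eight states the thesis uses.

```python
import numpy as np

# Row-normalized first-order contiguity matrix W: W[i, j] > 0 only if
# states i and j share a border (illustrative subset and neighbour lists).
states = ["SP", "RJ", "MG", "PR"]
neighbours = {"SP": ["RJ", "MG", "PR"],
              "RJ": ["SP", "MG"],
              "MG": ["SP", "RJ"],
              "PR": ["SP"]}
n = len(states)
W = np.zeros((n, n))
for i, s in enumerate(states):
    for nb in neighbours[s]:
        W[i, states.index(nb)] = 1.0
W = W / W.sum(axis=1, keepdims=True)
# In a spatial prior, the variance on cross-state VAR coefficients can then
# be tightened where W[i, j] = 0 and loosened where states are contiguous.
```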
APA, Harvard, Vancouver, ISO, and other styles
40

Shami, Roland G. (Roland George) 1960. "Bayesian analysis of a structural model with regime switching." Monash University, Dept. of Econometrics and Business Statistics, 2001. http://arrow.monash.edu.au/hdl/1959.1/9277.

Full text
APA, Harvard, Vancouver, ISO, and other styles
41

Ahelegbey, Daniel Felix <1983&gt. "Bayesian graphical models with economic and financial applications." Doctoral thesis, Università Ca' Foscari Venezia, 2015. http://hdl.handle.net/10579/6548.

Full text
Abstract:
Recent advances in empirical finance have seen a considerable amount of research in network econometrics for systemic risk analysis. The network approach aims to identify the key determinants of the structure and stability of the financial system, and the mechanism of systemic risk propagation. This thesis contributes to the literature by presenting a Bayesian graphical approach to modeling cause and effect relationships in observed data. It contributes specifically to model selection in moderate- and high-dimensional problems and develops Markov chain Monte Carlo procedures for efficient model estimation. It also provides simulation and empirical applications to model dynamics in macroeconomic variables and financial networks. The contributions are discussed in four self-contained chapters. Chapter 2 reviews the literature on network econometrics and presents a Bayesian graph-based approach as an alternative method. Chapter 3 proposes a Bayesian graphical approach to identification in structural vector autoregressive models. Chapter 4 develops a model selection approach for multivariate time series of large dimension through graphical vector autoregressive models, introducing sparsity in the structure of temporal dependence among the variables. Chapter 5 presents a stochastic framework for financial network models, proposing a hierarchical Bayesian graphical model that can usefully decompose dependencies between financial institutions into linkages between different countries' financial systems and linkages between banking institutions, within and/or across countries.
APA, Harvard, Vancouver, ISO, and other styles
42

Rossini, Luca <1987&gt. "Contributions to bayesian nonparametric and objective Bayes literature." Doctoral thesis, Università Ca' Foscari Venezia, 2016. http://hdl.handle.net/10579/10292.

Full text
Abstract:
The thesis contributes to the literature on Bayesian nonparametrics by proposing two approaches, the first related to time series analysis with a focus on sparsity of the matrix of coefficients, and the second to conditional copula models with an application to twin data. It also contributes to the literature on the analysis of the Yule-Simon distribution by proposing two objective priors on the parameter of the distribution and a Gibbs sampling algorithm for the analysis of the posterior distribution.
APA, Harvard, Vancouver, ISO, and other styles
43

Oduro, Samuel Dua. "Bayesian econometric modelling of informed trading, bid-ask spread and volatility." Thesis, University of Kent, 2016. https://kar.kent.ac.uk/61094/.

Full text
Abstract:
Recent developments in global financial markets have increased the need for research aimed at the measurement and possible reduction of liquidity risk. In particular, market crashes have been partly blamed on the sudden withdrawal of liquidity in markets and increases in liquidity risk. To this end, it is important to develop better approaches for inferring or quantifying liquidity risk. Liquidity risk caused by some investors trading on their information advantage (informed trading) has been a subject of market microstructure research over the last few decades. Researchers have employed information-based models that use observed or inferred order flow to investigate this problem. The Probability of Informed Trading (PIN) is a measure which uses inferred order flow to quantify the extent of information asymmetry. However, a number of computational issues have been reported to affect the estimation of PIN. Using an alternative methodology, we address the numerical problem associated with the estimation of PIN. Varied evidence of a relationship between volume and bid-ask spread has been documented in the extant literature. In particular, theory suggests that bid-ask spread and volume are jointly driven by a common process, as both variables measure an aspect of liquidity. The complex relationship between these variables is time-varying, since the informed trading component of order flow changes as trading takes place. Thus, volume and bid-ask spread may provide insight into the time-varying composition of the economic agents trading an asset. We exploit the nonlinear relationship between traded volume and bid-ask spread to develop a model that can be used to infer the informed and uninformed trading components of volume. The structure of the model and the estimation methodology enable the sequential processing and incorporation of past volume and bid-ask spread as conditioning information. The model is applied to two equities that trade on the New York Stock Exchange. Finally, to increase our understanding of the effects of liquidity risk on volatility, we also examine whether separating volume into informed and uninformed components can provide further insight into the relationship between liquidity risk and volatility.
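Because the abstract leans on the PIN measure, a compact sketch of the standard EKOP-style mixture likelihood (Easley et al., 1996) may help. Parameters are mapped through logistic and exponential transforms for unconstrained optimization; the data, starting values, and the maximum-likelihood route are illustrative, not the thesis's Bayesian estimation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln, logsumexp

def log_pois(k, lam):
    # Poisson log-pmf written explicitly for numerical stability.
    return k * np.log(lam) - lam - gammaln(k + 1)

def neg_loglik(theta, B, S):
    """EKOP-style mixture likelihood for daily buys B and sells S."""
    a = 1.0 / (1.0 + np.exp(-theta[0]))    # P(information event)
    d = 1.0 / (1.0 + np.exp(-theta[1]))    # P(bad news | event)
    mu, eb, es = np.exp(theta[2:])         # informed/uninformed intensities
    l_no = np.log(1 - a) + log_pois(B, eb) + log_pois(S, es)
    l_bad = np.log(a * d) + log_pois(B, eb) + log_pois(S, es + mu)
    l_good = np.log(a * (1 - d)) + log_pois(B, eb + mu) + log_pois(S, es)
    return -logsumexp(np.stack([l_no, l_bad, l_good]), axis=0).sum()

rng = np.random.default_rng(6)
B, S = rng.poisson(80, 250), rng.poisson(75, 250)   # hypothetical order flow
x0 = np.array([0.0, 0.0, np.log(20.0), np.log(70.0), np.log(70.0)])
res = minimize(neg_loglik, x0, args=(B, S), method="Nelder-Mead")
a = 1.0 / (1.0 + np.exp(-res.x[0]))
mu, eb, es = np.exp(res.x[2:])
pin = a * mu / (a * mu + eb + es)   # probability of informed trading
```

The final line is the usual PIN formula: the expected arrival rate of informed orders divided by the expected arrival rate of all orders.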
APA, Harvard, Vancouver, ISO, and other styles
44

Rotaru, Igor <1988&gt. "A Bayesian MS-SUR Model for Forecasting Exchange Rates." Master's Degree Thesis, Università Ca' Foscari Venezia, 2014. http://hdl.handle.net/10579/5207.

Full text
Abstract:
The thesis proposes a new Bayesian factor model for forecasting exchange rates, applying Markov chain Monte Carlo methods to Bayesian inference. First, we describe Zellner's Seemingly Unrelated Regression (SUR) multivariate model with ten macroeconomic fundamentals, used to forecast six exchange rates over the years 2002-2014. Secondly, we assume that a latent Markov-switching process drives the parameters of the SUR model, in order to detect structural instabilities. We develop MATLAB code for analysing and forecasting the monthly exchange rate series.
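The Markov-switching layer of such a model is typically handled with Hamilton's filter; below is a single filtering step for a two-state example with Gaussian measurement densities. The transition matrix, state-specific moments, and observation are invented numbers, and the full SUR structure is not reproduced.

```python
import numpy as np
from scipy.stats import norm

# One Hamilton-filter step for a two-state Markov-switching model:
# propagate state probabilities, then weight by state-conditional likelihoods.
P = np.array([[0.95, 0.05],
              [0.10, 0.90]])              # transition matrix (illustrative)
mu = np.array([0.0, 1.0])                 # state-specific means
sigma = np.array([0.5, 1.5])              # state-specific volatilities
xi = np.array([0.5, 0.5])                 # filtered P(state) at time t-1
y_t = 0.8                                 # new observation
pred = P.T @ xi                           # prediction step
lik = norm.pdf(y_t, loc=mu, scale=sigma)  # state-conditional densities
xi = pred * lik / (pred * lik).sum()      # Bayes update: filtered P(state) at t
```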
APA, Harvard, Vancouver, ISO, and other styles
45

Li, Guangjie. "Essays on economic and econometric applications of Bayesian estimation and model comparison." Thesis, University of Leicester, 2009. http://hdl.handle.net/2381/4792.

Full text
Abstract:
This thesis consists of three chapters on economic and econometric applications of Bayesian parameter estimation and model comparison. The first two chapters study the incidental parameter problem, mainly under a linear autoregressive (AR) panel data model with fixed effects. The first chapter investigates the problem from a model comparison perspective. The major finding of the first chapter is that consistency in parameter estimation and model selection are interrelated. The reparameterization of the fixed effect parameter proposed by Lancaster (2002) may not provide a valid solution to the incidental parameter problem if the wrong set of exogenous regressors is included. To estimate the model consistently and to measure its goodness of fit, the Bayes factor is found to be preferable for model comparison to the Bayesian information criterion based on the biased maximum likelihood estimates. When the model uncertainty is substantial, Bayesian model averaging is recommended. The method is applied to study the relationship between financial development and economic growth. The second chapter proposes a correction function approach to solve the incidental parameter problem. It is discovered that the correction function exists for the linear AR panel model of order p when the model is stationary with strictly exogenous regressors. MCMC algorithms are developed for parameter estimation and to calculate the Bayes factor for model comparison. The last chapter studies how stock return predictability and model uncertainty affect a rational buy-and-hold investor's decision to allocate her wealth over different lengths of investment horizon in the UK market. The FTSE All-Share Index is treated as the risky asset, and the UK Treasury bill as the riskless asset, in forming the investor's portfolio. Bayesian methods are employed to identify the most powerful predictors while accounting for model uncertainty. It is found that although stock return predictability is weak, it can still affect the investor's optimal portfolio decisions over different investment horizons.
APA, Harvard, Vancouver, ISO, and other styles
46

Sarferaz, Samad. "Essays on business cycle analysis and demography." Doctoral thesis, Humboldt-Universität zu Berlin, Wirtschaftswissenschaftliche Fakultät, 2010. http://dx.doi.org/10.18452/16151.

Full text
Abstract:
The thesis consists of four essays, which make empirical and methodological contributions to the fields of business cycle analysis and demography. The first essay presents insights on U.S. business cycle volatility since 1867, derived from a Bayesian dynamic factor model. The essay finds that volatility increased in the interwar period, a rise that was reversed after World War II. While evidence of postwar moderation relative to pre-1914 can be generated, this evidence is not robust to structural change, implemented through time-varying factor loadings. The second essay scrutinizes Bayesian features of dynamic index models. The essay shows that large-scale datasets can be used in levels throughout the whole analysis, without any prior assumption about persistence. Furthermore, the essay shows how to determine the number of factors accurately by computing the Bayes factor. The third essay presents a new way to model age-specific mortality rates. Covariates are incorporated and their dynamics are jointly modeled with the latent variables underlying mortality in all age classes. In contrast to the literature, a similar development of adjacent age groups is assured, allowing for consistent forecasts. The essay demonstrates that time series of covariates contain predictive power for age-specific rates. Furthermore, it is observed that parameter uncertainty in particular is important for long-run forecasts, implying that ignoring parameter uncertainty might yield misleadingly precise predictions. In the fourth essay the model developed in the third essay is utilized to conduct a structural analysis of macroeconomic fluctuations and age-specific mortality rates. The results reveal that the mortality of young adults differs noticeably from that of the rest of the population in its relation to business cycles. This implies that differentiating closely between particular age classes might be important in order to avoid spurious results.
APA, Harvard, Vancouver, ISO, and other styles
47

Bianchin, Daniele <1990&gt. "Bayesian Multivariate Autoregressive Gamma Processes: An Application to Realized Volatility." Master's Degree Thesis, Università Ca' Foscari Venezia, 2017. http://hdl.handle.net/10579/10626.

Full text
Abstract:
In this thesis I present the multivariate Autoregressive Gamma (ARG) process introduced by Le, Singleton and Dai (2010), a model founded on the univariate ARG first introduced by Gourieroux and Jasiak (2006). I discuss its mathematical properties and provide an MCMC algorithm for the Bayesian estimation of the parameters. The gamma process has been used because of its desirable properties in modelling realized volatility; for this reason, I evaluate its performance on a panel of realized volatilities for multiple assets.
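To make the ARG process concrete, here is a simulation sketch of the univariate ARG(1) under its Poisson-mixture representation; the parameter values are arbitrary, and the multivariate extension and MCMC estimation of the thesis are not attempted.

```python
import numpy as np

# Univariate ARG(1) via its Poisson mixture: z_t ~ Poisson(rho * y_{t-1} / c),
# then y_t ~ Gamma(delta + z_t, scale=c). Conditional mean: c*delta + rho*y_{t-1}.
rng = np.random.default_rng(7)
rho, delta, c, T = 0.9, 1.5, 0.1, 500     # illustrative parameters
y = np.empty(T)
y[0] = c * delta / (1.0 - rho)            # start at the stationary mean
for t in range(1, T):
    z = rng.poisson(rho * y[t - 1] / c)   # latent mixing variable
    y[t] = rng.gamma(delta + z, c)        # numpy's gamma takes (shape, scale)
```

The process stays strictly positive and mean-reverting, which is what makes it attractive for realized volatility.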
APA, Harvard, Vancouver, ISO, and other styles
48

Almeida, Vanda Regina Guimarães de. "Bayesian estimation of a DSGE model for the Portuguese economy." Master's thesis, Instituto Superior de Economia e Gestão, 2009. http://hdl.handle.net/10400.5/2775.

Full text
Abstract:
Master's degree in Applied Econometrics and Forecasting
In this paper, a New Keynesian DSGE model for a small open economy integrated in a monetary union is developed and estimated for the Portuguese economy, using a Bayesian approach. Estimates for some key structural parameters are obtained and a set of exercises exploring the model's statistical and economic properties are performed. A survey of the main events and literature associated with DSGE models that motivated this study is also provided, as well as a comprehensive discussion of the Bayesian estimation and model validation techniques applied. The model features five types of agents, namely households, firms, aggregators, the rest of the world and the government, and includes a number of shocks and frictions, which enable a closer matching of the short-run properties of the data and a more realistic short-term adjustment to shocks. It is assumed from the outset that monetary policy is defined by the union's central bank and that the domestic economy's size is negligible relative to that of the union, so that its specific economic fluctuations have no influence on the union's macroeconomic aggregates and monetary policy. An endogenous risk premium is considered, allowing for deviations of the domestic economy's interest rate from the union's. Furthermore, it is assumed that all trade and financial flows are performed with countries belonging to the union, which implies that the nominal exchange rate is irrevocably set to unity.
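The endogenous risk premium described here is commonly specified as debt-elastic, in the spirit of Schmitt-Grohé and Uribe (2003); the following is a generic sketch in notation of my own choosing, not necessarily the thesis's exact functional form.

```latex
% Debt-elastic interest rate: the domestic rate R_t deviates from the
% union-wide rate R_t^{*} with the net foreign asset position b_t.
R_t \;=\; R_t^{*}\,\exp\!\left(-\psi\,(b_t - \bar{b}) + \varepsilon_t^{rp}\right),
\qquad \psi > 0
```

Under this assumed form, accumulating external debt (b_t falling below its steady-state level) raises the interest rate domestic agents face above the union-wide rate, which is what allows the domestic rate to deviate from the union's.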
APA, Harvard, Vancouver, ISO, and other styles
49

Simoni, Anna <1980&gt. "Bayesian Analysis of Linear Inverse Problems with Applications in Economics and Finance." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1211/1/Tesi_Anna_Simoni.pdf.

Full text
Abstract:
In my PhD thesis I propose a Bayesian nonparametric estimation method for structural econometric models where the functional parameter of interest describes the economic agent's behavior. The structural parameter is characterized as the solution of a functional equation or, in more technical terms, as the solution of an inverse problem that can be either ill-posed or well-posed. From a Bayesian point of view, the parameter of interest is a random function and the solution to the inference problem is the posterior distribution of this parameter. A regular version of the posterior distribution in functional spaces is characterized. However, the infinite dimension of the spaces considered causes a problem of non-continuity of the solution and hence a problem of inconsistency of the posterior distribution from a frequentist point of view (i.e. a problem of ill-posedness). The contribution of this essay is to propose new methods to deal with this problem of ill-posedness. The first consists in adopting a Tikhonov regularization scheme in the construction of the posterior distribution, yielding a new object that I call the regularized posterior distribution and that I propose as the solution of the inverse problem. The second approach consists in specifying a prior distribution on the parameter of interest of the g-prior type. I then identify a class of models for which the prior distribution is able to correct for the ill-posedness even in infinite-dimensional problems. I study the asymptotic properties of these proposed solutions and prove that, under some regularity conditions satisfied by the true value of the parameter of interest, they are consistent in a "frequentist" sense. Having set out the general theory, I apply my Bayesian nonparametric methodology to different estimation problems. First, I apply the estimator to deconvolution and to hazard rate, density and regression estimation. Then, I consider the estimation of an instrumental regression, which is useful in microeconometrics when dealing with problems of endogeneity. Finally, I develop an application in finance: I obtain the Bayesian estimator for the equilibrium asset pricing functional by using the Euler equation defined in Lucas' (1978) tree-type models.
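Since the abstract's key device is the Tikhonov scheme, a schematic of Tikhonov regularization for a generic linear inverse problem may help; the notation below is generic and not the thesis's own.

```latex
% Tikhonov regularization for a linear inverse problem r = K\varphi + U:
% the penalty \alpha\lVert\varphi\rVert^2 restores continuity of the inverse.
\hat{\varphi}_{\alpha}
  \;=\; \arg\min_{\varphi}\; \lVert r - K\varphi \rVert^{2}
        + \alpha\,\lVert \varphi \rVert^{2}
  \;=\; \left(\alpha I + K^{*}K\right)^{-1} K^{*} r,
\qquad \alpha > 0
```

Here K* denotes the adjoint of K. Loosely speaking, the regularized posterior distribution of the thesis embeds an α-penalized inversion of this kind in the posterior's operators, and letting α tend to zero at a suitable rate with the sample size is what delivers the frequentist consistency mentioned above.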
APA, Harvard, Vancouver, ISO, and other styles
50

Simoni, Anna <1980&gt. "Bayesian Analysis of Linear Inverse Problems with Applications in Economics and Finance." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2009. http://amsdottorato.unibo.it/1211/.

Full text
APA, Harvard, Vancouver, ISO, and other styles
