
nep-ecm New Economics Papers
on Econometrics
Issue of 2011‒12‒19
nineteen papers chosen by
Sune Karlsson
Örebro University

  1. Marginal Likelihood for Markov-switching and Change-point GARCH Models By Luc Bauwens; Arnaud Dufays; Jeroen V.K. Rombouts
  2. GMM Estimation of Fixed Effects Dynamic Panel Data Models with Spatial Lag and Spatial Errors By Cizek, P.; Jacobs, J.P.A.M.; Ligthart, J.E.; Vrijburg, H.
  3. Of Copulas, Quantiles, Ranks and Spectra - An L1-Approach to Spectral Analysis By Holger Dette; Marc Hallin; Tobias Kley; Stanislav Volgushev
  4. Long Memory Dynamics for Multivariate Dependence under Heavy Tails By Pawel Janus; Siem Jan Koopman; André Lucas
  5. Asymptotic theory of range-based multipower variation By Kim Christensen; Mark Podolskij
  6. Estimation of panel data regression models with two-sided censoring or truncation By Sule Alan; Bo E. Honoré; Luojia Hu; Søren Leth-Petersen
  7. Non-parametric kernel estimation for symmetric Hawkes processes. Application to high frequency financial data By E. Bacry; K. Dayri; J. F. Muzy
  8. Bayesian analysis of coefficient instability in dynamic regressions By Emanuela Ciapanna; Marco Taboga
  9. Asymptotic behaviour of the posterior distribution in overfitted mixture models. By Rousseau, Judith; Mengersen, Kerrie
  10. Parameter Estimation and Forecasting for Multiplicative Lognormal Cascades By Andrés E. Leövey; Thomas Lux
  11. A method to estimate power parameter in Exponential Power Distribution via polynomial regression By Daniele Coin
  12. The estimation of three-dimensional fixed effects panel data models By Matyas, Laszlo; Balazsi, Laszlo
  13. Regular Variation and the Identification of Generalized Accelerated Failure-Time Models By Abbring, J.H.; Ridder, G.
  14. Detecting multiple breaks in long memory: The case of US inflation By Hassler, Uwe; Meller, Barbara
  15. Utility-based Forecast Evaluation with Multiple Decision Rules and a New Maxmin Rule By Manuel Lukas
  16. Skew-normal shocks in the linear state space form DSGE model By Grzegorz Grabek; Bohdan Klos; Grzegorz Koloch
  17. Regression Discontinuity Designs with an Endogenous Forcing Variable and an Application to Contracting in Health Care By Patrick Bajari; Han Hong; Minjung Park; Robert Town
  18. Estimating financial risk using piecewise Gaussian processes By I. Garcia; J. Jimenez
  19. Using the regression discontinuity design with implicit partitions: The impacts of comunidades solidarias rurales on schooling in El Salvador By de Brauw, Alan; Gilligan, Daniel

  1. By: Luc Bauwens (Université catholique de Louvain, CORE); Arnaud Dufays (Université catholique de Louvain, CORE); Jeroen V.K. Rombouts (Institute of Applied Economics at HEC Montréal, CIRANO, CIRPEE, and CORE)
    Abstract: GARCH volatility models with fixed parameters are too restrictive for long time series due to breaks in the volatility process. Flexible alternatives are Markov-switching GARCH and change-point GARCH models. They require estimation by MCMC methods due to the path dependence problem. An unsolved issue is the computation of their marginal likelihood, which is essential for determining the number of regimes or change-points. We solve the problem by using particle MCMC, a technique proposed by Andrieu, Doucet, and Holenstein (2010). We examine the performance of this new method on simulated data, and we illustrate its use on several return series.
    Keywords: Bayesian inference, Simulation, GARCH, Markov-switching model, Changepoint model, Marginal likelihood, Particle MCMC
    JEL: C11 C15 C22 C58
    Date: 2011–11–24
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-41&r=ecm
  2. By: Cizek, P.; Jacobs, J.P.A.M.; Ligthart, J.E.; Vrijburg, H. (Tilburg University, Center for Economic Research)
    Abstract: We extend the three-step generalized method of moments (GMM) approach of Kapoor et al. (2007), which corrects for spatially correlated errors in static panel data models, by introducing a spatial lag and a one-period lag of the dependent variable as additional explanatory variables. Combining the extended Kapoor et al. (2007) approach with the dynamic panel data model GMM estimators of Arellano and Bond (1991) and Blundell and Bond (1998) and specifying moment conditions for various time lags, spatial lags, and sets of exogenous variables yields new spatial dynamic panel data estimators. We prove their consistency and asymptotic normality for a large number of spatial units N and a fixed small number of time periods T. Monte Carlo simulations demonstrate that the root mean squared error of spatially corrected GMM estimates, which are based on a spatial lag and spatial error correction, is generally smaller than that of corresponding spatial GMM estimates in which spatial error correlation is ignored. We show that the spatial Blundell-Bond estimators outperform the spatial Arellano-Bond estimators.
    Keywords: Dynamic panel models; spatial lag; spatial error; GMM estimation.
    JEL: C15 C21 C22 C23
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2011134&r=ecm
  3. By: Holger Dette; Marc Hallin; Tobias Kley; Stanislav Volgushev
    Abstract: In this paper we present an alternative method for the spectral analysis of a strictly stationary time series {Yt, t ∈ Z}. We define a “new” spectrum as the Fourier transform of the differences between copulas of the pairs (Yt, Yt−k) and the independence copula. This object is called the copula spectral density kernel and allows one to separate marginal and serial aspects of a time series. We show that it is intrinsically related to the concept of quantile regression. As in quantile regression, which provides more information about the conditional distribution than the classical location-scale model, the copula spectral density kernel is more informative than the spectral density obtained from the autocovariances. In particular, the approach provides a complete description of the distributions of all pairs (Yt, Yt−k). Moreover, it inherits the robustness properties of classical quantile regression, because it does not require any distributional assumptions such as the existence of finite moments. In order to estimate the copula spectral density kernel we introduce rank-based Laplace periodograms which are calculated as bilinear forms of weighted L1-projections of the ranks of the observed time series onto a harmonic regression model. We establish the asymptotic distribution of those periodograms, and the consistency of adequately smoothed versions. The finite-sample properties of the new methodology and its potential for applications are briefly investigated by simulations and a short empirical example. (An illustrative sketch of the rank-based L1 projection follows this entry.)
    Keywords: Time series; spectral analysis; periodogram; quantile regression; copulas; ranks; time reversibility
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/104763&r=ecm
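    Illustrative sketch (Python): a minimal version of the rank-based L1 projection described above. The normalized ranks of the series are regressed, under the check loss, on a cosine/sine pair at a Fourier frequency, and the squared norm of the harmonic coefficients is taken as a raw periodogram ordinate. The paper's exact weighting, scaling constants and smoothing are omitted, and the statsmodels quantile-regression solver is only one possible choice.
      # Rank-based L1 (Laplace) periodogram ordinate at one Fourier frequency.
      # Sketch of the idea only; the paper's weighting, scaling and smoothing
      # are not reproduced.
      import numpy as np
      from scipy.stats import rankdata
      import statsmodels.api as sm
      from statsmodels.regression.quantile_regression import QuantReg

      def laplace_periodogram_ordinate(y, freq_index, tau=0.5):
          """Raw rank-based Laplace periodogram ordinate at frequency
          2*pi*freq_index/n for quantile level tau."""
          n = len(y)
          u = rankdata(y) / (n + 1.0)            # normalized ranks in (0, 1)
          t = np.arange(1, n + 1)
          omega = 2.0 * np.pi * freq_index / n
          X = sm.add_constant(np.column_stack([np.cos(omega * t), np.sin(omega * t)]))
          fit = QuantReg(u, X).fit(q=tau)        # L1 (check-loss) projection of the ranks
          b = fit.params[1:]                     # harmonic coefficients
          return float(b @ b)                    # squared norm as raw ordinate

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          y = np.zeros(512)
          for t in range(1, 512):                # heavy-tailed AR(1) toy series
              y[t] = 0.6 * y[t - 1] + rng.standard_t(df=3)
          ords = [laplace_periodogram_ordinate(y, k) for k in range(1, 60)]
          print("largest ordinate at frequency index", 1 + int(np.argmax(ords)))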
  4. By: Pawel Janus (VU University Amsterdam); Siem Jan Koopman (VU University Amsterdam); André Lucas (VU University Amsterdam)
    Abstract: We develop a new simultaneous time series model for volatility and dependence with long memory (fractionally integrated) dynamics and heavy-tailed densities. Our new multivariate model accounts for typical empirical features in financial time series while being robust to outliers or jumps in the data. In the empirical study for four Dow Jones equities, we find that the degree of memory in the volatilities of the equity return series is similar, while the degree of memory in correlations between the series varies significantly. The forecasts from our model are compared with high-frequency realised volatility and dependence measures. Overall, the forecast accuracy is higher than that of some well-known competing benchmark models.
    Keywords: fractional integration; correlation; Student's t copula; time-varying dependence; multivariate volatility
    JEL: C10 C22 C32 C51
    Date: 2011–12–12
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20110175&r=ecm
  5. By: Kim Christensen (Aarhus University and CREATES); Mark Podolskij (University of Heidelberg and CREATES)
    Abstract: In this paper, we present a realised range-based multipower variation theory, which can be used to estimate return variation and draw jump-robust inference about the diffusive volatility component, when a high-frequency record of asset prices is available. The standard range-statistic – routinely used in financial economics to estimate the variance of securities prices – is shown to be biased when the price process contains jumps. We outline how the new theory can be applied to remove this bias by constructing a hybrid range-based estimator. Our asymptotic theory also reveals that when high-frequency data are sparsely sampled, as is often done in practice due to the presence of microstructure noise, the range-based multipower variations can produce significant efficiency gains over comparable subsampled return-based estimators. The analysis is supported by a simulation study and we illustrate the practical use of our framework on some recent TAQ equity data. (A toy range-based sketch follows this entry.)
    Keywords: High-frequency data, Integrated variance, Realised multipower variation, Realised range-based multipower variation, Quadratic variation.
    JEL: C10 C80
    Date: 2011–10–30
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-47&r=ecm
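    Illustrative sketch (Python): a numerical toy comparing a range-based realized variance with a bipower-type range statistic on a simulated day containing one jump. Only the classical Brownian range constants E[range] = sqrt(8/pi) and E[range^2] = 4 log 2 are used; the paper's general multipower theory, noise corrections and asymptotics are not reproduced.
      # Range-based realized variance vs. a bipower-type range statistic.
      # Minimal sketch only; the constants are the standard Brownian-motion
      # range moments, not the paper's general multipower theory.
      import numpy as np

      rng = np.random.default_rng(1)
      sigma = 0.2                      # annualized diffusive volatility
      n_blocks, m = 78, 60             # 78 five-minute blocks, 60 ticks per block
      dt = 1.0 / (252 * n_blocks * m)  # tick length in years

      # simulate one trading day of log prices and add a single jump
      increments = sigma * np.sqrt(dt) * rng.standard_normal(n_blocks * m)
      increments[rng.integers(0, n_blocks * m)] += 0.02    # a 2% jump
      logp = np.cumsum(increments)

      # high-low range of the log price within each block
      blocks = logp.reshape(n_blocks, m)
      ranges = blocks.max(axis=1) - blocks.min(axis=1)

      lam1 = np.sqrt(8.0 / np.pi)      # E[range of std BM on the unit interval]
      lam2 = 4.0 * np.log(2.0)         # E[range^2 of std BM on the unit interval]

      rrv = np.sum(ranges**2) / lam2                     # jump-sensitive
      rbv = np.sum(ranges[1:] * ranges[:-1]) / lam1**2   # bipower-type, jump-robust
      iv_true = sigma**2 / 252                           # integrated variance per day

      print(f"true IV {iv_true:.6f}  range-RV {rrv:.6f}  range-bipower {rbv:.6f}")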
  6. By: Sule Alan; Bo E. Honoré; Luojia Hu; Søren Leth-Petersen
    Abstract: This paper constructs estimators for panel data regression models with individual-specific heterogeneity and two-sided censoring and truncation. Following Powell (1986), the estimation strategy is based on moment conditions constructed from re-censored or re-truncated residuals. While these moment conditions do not identify the parameter of interest, they can be used to motivate objective functions that do. We apply one of the estimators to study the effect of a Danish tax reform on household portfolio choice. The idea behind the estimators can also be used in a cross-sectional setting.
    Keywords: Regression analysis
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:fip:fedhwp:wp-2011-08&r=ecm
  7. By: E. Bacry; K. Dayri; J. F. Muzy
    Abstract: We define a numerical method that provides a non-parametric estimation of the kernel shape in symmetric multivariate Hawkes processes. This method relies on second order statistical properties of Hawkes processes that relate the covariance matrix of the process to the kernel matrix. The square root of the correlation function is computed using a minimal phase recovering method. We illustrate our method on some examples and provide an empirical study of the estimation errors. Within this framework, we analyze high frequency financial price data modeled as 1D or 2D Hawkes processes. We find slowly decaying (power-law) kernel shapes suggesting a long memory nature of self-excitation phenomena at the microstructure level of price dynamics.
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1112.1838&r=ecm
  8. By: Emanuela Ciapanna (Bank of Italy); Marco Taboga (Bank of Italy)
    Abstract: This paper proposes a Bayesian regression model with time-varying coefficients (TVC) that makes it possible to estimate jointly the degree of instability and the time-path of regression coefficients. Thanks to its computational tractability, the model proves suitable to perform the first (to our knowledge) Monte Carlo study of the finite-sample properties of a TVC model. Under several specifications of the data generating process, the proposed model’s estimation precision and forecasting accuracy compare favourably with those of other methods commonly used to deal with parameter instability. Furthermore, the TVC model entails only small losses of efficiency under the null of stability and is robust to mis-specification, performing satisfactorily also when regression coefficients experience discrete structural breaks. As a demonstrative application, we use our TVC model to estimate the exposures of S&P 500 stocks to market-wide risk factors: we find that a vast majority of stocks have time-varying risk exposures and that the TVC model helps to forecast these exposures more accurately. (A stylized state-space sketch follows this entry.)
    Keywords: time-varying regression, coefficient instability
    JEL: C11 C32 C50
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_836_11&r=ecm
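    Illustrative sketch (Python): a stylized time-varying-coefficient regression in state-space form, with random-walk coefficients tracked by a Kalman filter. The noise variances are fixed by hand rather than estimated, so this is a simplified stand-in for, not an implementation of, the Bayesian TVC model of the paper.
      # Time-varying-coefficient regression via a Kalman filter.
      # Sketch only: coefficients follow random walks and the noise variances
      # are fixed instead of being estimated (the paper treats the degree of
      # instability as an object of Bayesian inference).
      import numpy as np

      def tvc_kalman_filter(y, X, sig_eps=1.0, sig_eta=0.05):
          """Filtered paths of beta_t in  y_t = x_t' beta_t + eps_t,
          beta_t = beta_{t-1} + eta_t  (random-walk coefficients)."""
          n, k = X.shape
          beta = np.zeros(k)                 # filtered state
          P = np.eye(k) * 10.0               # diffuse-ish initial state variance
          Q = np.eye(k) * sig_eta**2
          path = np.zeros((n, k))
          for t in range(n):
              P = P + Q                      # prediction step
              x = X[t]
              f = x @ P @ x + sig_eps**2     # prediction variance of y_t
              K = P @ x / f                  # Kalman gain
              beta = beta + K * (y[t] - x @ beta)
              P = P - np.outer(K, x) @ P
              path[t] = beta
          return path

      if __name__ == "__main__":
          rng = np.random.default_rng(2)
          n = 500
          X = np.column_stack([np.ones(n), rng.standard_normal(n)])
          true_slope = np.where(np.arange(n) < 250, 0.5, 1.5)   # a discrete break
          y = X[:, 0] * 1.0 + X[:, 1] * true_slope + rng.standard_normal(n)
          betas = tvc_kalman_filter(y, X)
          print("filtered slope, early vs late:",
                betas[:250, 1].mean().round(2), betas[250:, 1].mean().round(2))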
  9. By: Rousseau, Judith; Mengersen, Kerrie
    Abstract: In this paper we study the asymptotic behaviour of the posterior distribution in a mixture model when the number of components in the mixture is larger than the true number of components, a situation commonly referred to as an overfitted mixture. We prove in particular that, quite generally, the posterior distribution has a stable and interesting behaviour: it tends to empty the extra components. This stability is achieved under some restriction on the prior, which can be used as a guideline for choosing the prior. Some simulations are presented to illustrate this behaviour. (A small numerical illustration follows this entry.)
    Keywords: posterior concentration; mixture models; overfitting; asymptotics; Bayesian
    JEL: C11
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ner:dauphi:urn:hdl:123456789/4648&r=ecm
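    Illustrative sketch (Python): data are generated from a two-component Gaussian mixture and a six-component Bayesian mixture is fitted with a small Dirichlet concentration on the weights, so that the extra components are driven towards zero weight. scikit-learn's variational approximation is used purely for illustration; it is not the exact posterior analysed in the paper.
      # Overfitted mixture illustration: fit 6 components to data generated
      # from a 2-component Gaussian mixture. With a small Dirichlet
      # concentration on the weights, the extra components tend to be emptied.
      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      rng = np.random.default_rng(3)
      x = np.concatenate([rng.normal(-2.0, 1.0, 600),
                          rng.normal(3.0, 0.7, 400)]).reshape(-1, 1)

      model = BayesianGaussianMixture(
          n_components=6,                                  # deliberately too many
          weight_concentration_prior_type="dirichlet_distribution",
          weight_concentration_prior=0.01,                 # small -> favours emptying
          max_iter=500, random_state=0)
      model.fit(x)

      print("estimated weights:", np.round(np.sort(model.weights_)[::-1], 3))
      # typically two sizeable weights and four weights close to zero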
  10. By: Andrés E. Leövey; Thomas Lux
    Abstract: We study the well-known multiplicative Lognormal cascade process in which the multiplication of Gaussian and Lognormally distributed random variables yields time series with intermittent bursts of activity. Due to the non-stationarity of this process and the combinatorial nature of such a formalism, its parameters have been estimated mostly by fitting the numerical approximation of the associated non-Gaussian pdf to empirical data, cf. Castaing et al. [Physica D, 46, 177 (1990)]. More recently, an alternative estimator based upon qth order absolute moments has been introduced by Kiyono et al. [Phys. Rev. E 76 41113 (2007)]. In this paper, we pursue this moment-based approach further and develop a more rigorous Generalized Method of Moments (GMM) estimation procedure to cope with the documented difficulties of previous methodologies. We show that even under uncertainty about the actual number of cascade steps, our methodology yields very reliable results for the estimated intermittency parameter. Employing the Levinson-Durbin algorithm for best linear forecasts, we also show that estimated parameters can be used for forecasting the evolution of the turbulent flow. We compare forecasting results from the GMM and Kiyono et al.'s procedure via Monte Carlo simulations. We finally test the applicability of our approach by estimating the intermittency parameter and forecasting volatility for a sample of financial data from stock and foreign exchange markets.
    Keywords: Random Lognormal cascades, GMM estimation, best linear forecasting, volatility of financial returns
    JEL: C20 G12
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:kie:kieliw:1746&r=ecm
  11. By: Daniele Coin (Bank of Italy)
    Abstract: The Exponential Power Distribution (EPD), also known as Generalized Error Distribution (GED), is a flexible symmetrical unimodal family belonging to the exponential family. The EPD becomes the density function of a range of symmetric distributions with different values of its power parameter B. A closed-form estimator for B does not exist, so the power parameter is usually estimated numerically. Unfortunately, the optimization algorithms do not always converge, especially when the true value of B is close to its parametric space frontier. In this paper we present an alternative method for estimating B, based on the Normal Standardized Q-Q Plot and exploiting the relationship between B and the kurtosis. It is a direct method that does not require computational efforts or the use of optimization algorithms. (A moment-based illustration of the B-kurtosis link follows this entry.)
    Keywords: Exponential Power Distribution, kurtosis, normal standardized Q-Q plot.
    JEL: C14 C15 C63
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_834_11&r=ecm
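    Illustrative sketch (Python): the B-kurtosis link mentioned above can be exploited directly by matching the sample kurtosis to the theoretical GED kurtosis Gamma(5/B)Gamma(1/B)/Gamma(3/B)^2 and solving for B numerically. This moment-matching stand-in is not the authors' Q-Q-plot polynomial-regression estimator.
      # Estimate the EPD/GED power parameter B by matching the sample kurtosis
      # to the theoretical kurtosis Gamma(5/B)Gamma(1/B)/Gamma(3/B)^2.
      # Stand-in illustration only, not the paper's Q-Q-plot regression method.
      import numpy as np
      from scipy.special import gammaln
      from scipy.optimize import brentq
      from scipy.stats import kurtosis, gennorm

      def ged_kurtosis(b):
          # computed on the log scale for numerical stability
          return np.exp(gammaln(5.0 / b) + gammaln(1.0 / b) - 2.0 * gammaln(3.0 / b))

      def estimate_power_parameter(x, lo=0.3, hi=50.0):
          k = kurtosis(x, fisher=False)          # raw (non-excess) kurtosis
          # ged_kurtosis(B) decreases from +inf towards 1.8 as B grows, so a
          # root exists whenever the sample kurtosis lies in that range
          return brentq(lambda b: ged_kurtosis(b) - k, lo, hi)

      if __name__ == "__main__":
          for true_b in (1.0, 2.0, 4.0):
              x = gennorm.rvs(true_b, size=200_000, random_state=123)
              print(f"true B = {true_b}, estimated B = {estimate_power_parameter(x):.2f}")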
  12. By: Matyas, Laszlo; Balazsi, Laszlo
    Abstract: The paper introduces the appropriate Within estimators for the most frequently used three-dimensional fixed effects panel data models. It analyzes the behaviour of these estimators in the case of no-self-flow data, unbalanced data and dynamic autoregressive models. (A sketch of the balanced-case Within transformation follows this entry.)
    Keywords: panel data; unbalanced panel; dynamic panel data model; multidimensional panel data; fixed effects; trade models; gravity models; FDI
    JEL: C13 F17 C23 F47
    Date: 2011–12–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34976&r=ecm
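    Illustrative sketch (Python): the Within transformation for the simplest three-dimensional specification y_ijt = x_ijt'beta + a_i + g_j + l_t + e_ijt on a balanced cube, which subtracts the three one-way means and adds back twice the grand mean. The richer effect structures, unbalanced panels and dynamic models treated in the paper are not covered.
      # Within estimator for the simplest three-dimensional fixed-effects model
      #   y_ijt = x_ijt' beta + a_i + g_j + l_t + e_ijt
      # on a balanced (N_i x N_j x T) cube: subtract the three one-way means,
      # add back twice the grand mean, then run OLS on the transformed data.
      import numpy as np

      def within_3d(z):
          """Balanced three-way Within transformation of an (N_i, N_j, T) array."""
          return (z
                  - z.mean(axis=(1, 2), keepdims=True)   # removes a_i
                  - z.mean(axis=(0, 2), keepdims=True)   # removes g_j
                  - z.mean(axis=(0, 1), keepdims=True)   # removes l_t
                  + 2.0 * z.mean(axis=(0, 1, 2), keepdims=True))

      rng = np.random.default_rng(5)
      Ni, Nj, T, beta = 20, 20, 10, 1.5
      a = rng.normal(size=(Ni, 1, 1))
      g = rng.normal(size=(1, Nj, 1))
      l = rng.normal(size=(1, 1, T))
      x = rng.normal(size=(Ni, Nj, T)) + 0.5 * (a + g + l)  # x correlated with effects
      y = beta * x + a + g + l + rng.normal(size=(Ni, Nj, T))

      xw, yw = within_3d(x).ravel(), within_3d(y).ravel()
      beta_hat = (xw @ yw) / (xw @ xw)                       # Within (OLS) slope
      print(f"true beta = {beta}, Within estimate = {beta_hat:.3f}")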
  13. By: Abbring, J.H.; Ridder, G. (Tilburg University, Center for Economic Research)
    Abstract: Ridder (1990) provides an identification result for the Generalized Accelerated Failure-Time (GAFT) model. We point out that Ridder's proof of this result is incomplete, and provide an amended proof with an additional necessary and sufficient condition that requires that a function varies regularly at 0 and 1. We also give more readily interpretable sufficient conditions on the tails of the error distribution or the asymptotic behavior of the transformation of the dependent variable. The sufficient conditions are shown to encompass all previous results on the identification of the Mixed Proportional Hazards (MPH) model. Thus, this paper not only clarifies, but also unifies the literature on the non-parametric identification of the GAFT and MPH models.
    Keywords: duration analysis; identifiability; Mixed Proportional Hazards model; regular variation.
    JEL: C14 C41
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:2011135&r=ecm
  14. By: Hassler, Uwe; Meller, Barbara
    Abstract: Multiple structural change tests by Bai and Perron (1998) are applied to the regression by Demetrescu, Kuzin and Hassler (2008) in order to detect breaks in the order of fractional integration. With this instrument we tackle time-varying inflation persistence as an important issue for monetary policy. We determine not only the location and significance of breaks in persistence, but also the number of breaks. Only one significant break in U.S. inflation persistence (measured by the long-memory parameter) is found to have taken place in 1973, while a second break in 1980 is not significant.
    Keywords: Fractional integration, break in persistence, unknown break point, inflation dynamics
    JEL: C22 E31
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdp1:201126&r=ecm
  15. By: Manuel Lukas (Aarhus University and CREATES)
    Abstract: In this paper we generalize the existing approach to utility-based evaluation of density forecast models by allowing for multiple decision rules. In the generalized approach forecast models and decision rules can only be evaluated jointly. We show how to conduct the joint evaluation and explore to what extent conclusions about either forecast models or decision rules are possible. As a specific decision rule we introduce a Gilboa-Schmeidler (1989) type multiple-prior maxmin decision rule, where we use the model confidence set of Hansen, Lunde, and Nason (2011) as priors. In an empirical application, the density forecasts of five GARCH-type models are combined with this maxmin rule and other decision rules for static portfolio choice with daily data on the S&P500. (A toy maxmin sketch follows this entry.)
    Keywords: Decision rules, forecast evaluation and comparison, maxmin, ambiguity aversion, portfolio choice.
    JEL: C44 C53 D81 G11
    Date: 2011–11–26
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-42&r=ecm
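    Illustrative sketch (Python): a toy maxmin portfolio choice in which each candidate model supplies hypothetical predictive moments for the risky return and the rule maximizes the worst-case mean-variance utility over the model set. The paper's construction of the prior set from a model confidence set, and its density-forecast evaluation, are not reproduced; the model names and moments below are invented for the example.
      # Toy maxmin portfolio choice over a set of candidate predictive models.
      # Each model supplies a hypothetical predictive mean mu_m and variance
      # s2_m for the risky excess return; the maxmin rule picks the weight w
      # maximizing min_m (w*mu_m - 0.5*gamma*w^2*s2_m).
      import numpy as np

      gamma = 5.0                                        # risk aversion
      models = {                                         # hypothetical predictive moments
          "garch":  (0.0004, 0.00012),
          "gjr":    (0.0002, 0.00020),
          "egarch": (0.0006, 0.00009),
      }

      grid = np.linspace(-1.0, 2.0, 3001)                # candidate portfolio weights

      def worst_case_utility(w):
          return min(w * mu - 0.5 * gamma * w**2 * s2 for mu, s2 in models.values())

      w_maxmin = grid[np.argmax([worst_case_utility(w) for w in grid])]

      # for comparison: the single-model optimal weights w_m = mu_m / (gamma*s2_m)
      for name, (mu, s2) in models.items():
          print(f"{name:7s} optimal weight {mu / (gamma * s2):6.3f}")
      print(f"maxmin  optimal weight {w_maxmin:6.3f}")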
  16. By: Grzegorz Grabek (National Bank of Poland, Economic Institute); Bohdan Klos (National Bank of Poland, Economic Institute); Grzegorz Koloch (National Bank of Poland, Economic Institute)
    Abstract: Observed macroeconomic data – notably GDP growth rate, inflation and interest rates – can be, and usually are, skewed. Economists attempt to fit models to data by matching first and second moments or co-moments, but skewness is usually neglected. This is probably because skewness cannot appear in linear (or linearized) models with Gaussian shocks, and shocks are usually assumed to be Gaussian. Skewness requires non-linearities or non-Gaussian shocks. In this paper we introduce skewness into the DSGE framework assuming a skew-normal distribution for shocks while keeping the model linear (or linearized). We argue that such skewness can be perceived as structural, since it concerns the nature of structural shocks. Importantly, the skew-normal distribution nests the normal one, so that skewness is not assumed, but only allowed for. We derive elementary facts about skewness propagation in the state space model and, using the well-known Lubik-Schorfheide model, we run simulations to investigate how skewness propagates from shocks to observables in a standard DSGE model. We also assess properties of an ad hoc two-step estimator of the models’ parameters, among them the shocks’ skewness parameters. (A small propagation sketch follows this entry.)
    JEL: C12 C13 C16 D58 E32
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:nbp:nbpmis:101&r=ecm
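    Illustrative sketch (Python): centered skew-normal shocks are drawn from the standard delta*|U0| + sqrt(1-delta^2)*U1 representation and propagated through an AR(1) state equation, showing that the state inherits (attenuated) skewness from the shocks. The Lubik-Schorfheide application and the two-step estimator of the paper are not reproduced.
      # Skew-normal shocks propagated through a linear (AR(1)) state equation.
      # Shocks use the representation Z = delta*|U0| + sqrt(1-delta^2)*U1 and
      # are centered to mean zero; skewness survives the linear transition.
      import numpy as np
      from scipy.stats import skew

      def skew_normal_shocks(alpha, size, rng):
          """Centered skew-normal draws with shape parameter alpha."""
          delta = alpha / np.sqrt(1.0 + alpha**2)
          u0 = np.abs(rng.standard_normal(size))
          u1 = rng.standard_normal(size)
          z = delta * u0 + np.sqrt(1.0 - delta**2) * u1
          return z - delta * np.sqrt(2.0 / np.pi)        # subtract E[Z] to center

      rng = np.random.default_rng(6)
      n, rho, alpha = 200_000, 0.9, 5.0                  # persistence and shock shape
      eps = skew_normal_shocks(alpha, n, rng)

      x = np.zeros(n)                                    # state: x_t = rho*x_{t-1} + eps_t
      for t in range(1, n):
          x[t] = rho * x[t - 1] + eps[t]

      print(f"skewness of shocks: {skew(eps):.3f}, skewness of state: {skew(x):.3f}")
      # the state inherits skewness from the shocks, attenuated by the
      # averaging implicit in the AR(1) accumulation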
  17. By: Patrick Bajari; Han Hong; Minjung Park; Robert Town
    Abstract: Regression discontinuity designs (RDDs) are a popular method to estimate treatment effects. However, RDDs may fail to yield consistent estimates if the forcing variable can be manipulated by the agent. In this paper, we examine one interesting set of economic models with such a feature. Specifically, we examine the case where there is a structural relationship between the forcing variable and the outcome variable because they are determined simultaneously. We propose a modified RDD estimator for such models and derive the conditions under which it is consistent. As an application of our method, we study contracts between a large managed care organization and leading hospitals for the provision of organ and tissue transplants. Exploiting "donut holes" in the reimbursement contracts, we estimate how the total claims filed by the hospitals depend on the generosity of the reimbursement structure. Our results show that hospitals submit significantly larger bills when the reimbursement rate is higher, indicating informational asymmetries between the payer and hospitals in this market.
    JEL: D82 I10 L14
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:17643&r=ecm
  18. By: I. Garcia; J. Jimenez
    Abstract: We present a computational method for measuring financial risk by estimating the Value at Risk and Expected Shortfall from financial series. We have made two assumptions: first, that the predictive distributions of the values of an asset are conditioned by information on the way in which the variable evolves from similar conditions, and second, that the underlying random processes can be described using piecewise Gaussian processes. The performance of the method was evaluated by using it to estimate VaR and ES for a daily data series taken from the S&P500 index and applying a backtesting procedure recommended by the Basel Committee on Banking Supervision. The results indicated a satisfactory performance. (A baseline VaR/ES sketch follows this entry.)
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1112.2889&r=ecm
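    Illustrative sketch (Python): a baseline in which VaR and ES are computed from a single fitted Gaussian and backtested by counting exceedances, roughly in the spirit of the Basel traffic-light approach. The paper replaces this static Gaussian with conditional, piecewise Gaussian-process dynamics; the returns below are simulated.
      # Baseline VaR/ES estimation and a simple exceedance-count backtest.
      # A single unconditional Gaussian is fitted purely as a reference point;
      # the paper uses conditional, piecewise Gaussian-process dynamics.
      import numpy as np
      from scipy.stats import norm

      alpha = 0.01                                     # 1% tail
      rng = np.random.default_rng(7)
      returns = 0.01 * rng.standard_t(df=5, size=1500) # stand-in for daily returns

      mu, sig = returns.mean(), returns.std(ddof=1)
      q = norm.ppf(alpha)
      var_gauss = -(mu + sig * q)                      # Value at Risk (loss quantile)
      es_gauss = -(mu - sig * norm.pdf(q) / alpha)     # Expected Shortfall under normality

      exceedances = int(np.sum(returns < -var_gauss))  # days the loss exceeded VaR
      expected = alpha * len(returns)

      print(f"VaR(1%) = {var_gauss:.4f}, ES(1%) = {es_gauss:.4f}")
      print(f"exceedances: {exceedances} observed vs {expected:.1f} expected")
      # a heavy-tailed series typically breaches the Gaussian VaR more often
      # than expected, which is what conditional models try to correct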
  19. By: de Brauw, Alan; Gilligan, Daniel
    Abstract: Regression discontinuity design (RDD) is a useful tool for evaluating programs when a single variable is used to determine program eligibility. RDD has also been used to evaluate programs when eligibility is based on multiple variables that have been aggregated into a single index using explicit, often arbitrary, weights. In this paper, we show that under specific conditions, regression discontinuity can be used in instances when more than one variable is used to determine eligibility, without assigning explicit weights to map those variables into a single measure.
    Keywords: Regression discontinuity design, partitioned cluster analysis, schooling, impact evaluation
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:fpr:ifprid:1116&r=ecm

This nep-ecm issue is ©2011 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.