
nep-ecm New Economics Papers
on Econometrics
Issue of 2019‒04‒15
eighteen papers chosen by
Sune Karlsson
Örebro universitet

  1. Doubly Robust-type Estimation of Population Moments and Parameters in Biased Sampling By Takahiro Hoshino; Yuya Shimizu
  2. A PRIMER ON BOOTSTRAP TESTING OF HYPOTHESES IN TIME SERIES MODELS: WITH AN APPLICATION TO DOUBLE AUTOREGRESSIVE MODELS By Giuseppe Cavaliere; Anders Rahbek
  3. NEAREST COMOMENT ESTIMATION WITH UNOBSERVED FACTORS By Kris Boudt; Dries Cornilly; Tim Verdonck
  4. Testing independence of covariates and errors in nonparametric regression By Sankar, Subhra; Bergsma, Wicher; Dassios, Angelos
  5. Tests for conditional heteroscedasticity with functional data and goodness-of-fit tests for FGARCH models By Rice, Gregory; Wirjanto, Tony; Zhao, Yuqian
  6. Binary Choice Models with High-Dimensional Individual and Time Fixed Effects By Daniel Czarnowske; Amrei Stammann
  7. Bayesian Estimation of Mixed Multinomial Logit Models: Advances and Simulation-Based Evaluations By Prateek Bansal; Rico Krueger; Michel Bierlaire; Ricardo A. Daziano; Taha H. Rashidi
  8. Jackknife Empirical Likelihood for Inequality Constraints on Regular Functionals By Chen, Ruxin; Tabri, Rami V.
  9. Local Polynomial Estimation of Time-Varying Parameters in Nonlinear Models By Dennis Kristensen; Young Jun Lee
  10. Estimation and inference in spatial models with dominant units By M. Hashem Pesaran; Cynthia Fan Yang
  11. Deconstructing the yield curve By Crump, Richard K.; Gospodinov, Nikolay
  12. Tests of Conditional Predictive Ability: Some Simulation Evidence By McCracken, Michael W.
  13. Estimating Impulse Response Functions When the Shock Series Is Observed By Choi, Chi-Young; Chudik, Alexander
  14. The Endo-Exo Problem in High Frequency Financial Price Fluctuations and Rejecting Criticality By Spencer Wheatley; Alexander Wehrli; Didier Sornette
  15. Bayesian prediction of jumps in large panels of time series data By Angelos Alexopoulos; Petros Dellaportas; Omiros Papaspiliopoulos
  16. Microeconometric Dynamic Panel Data Methods: Model Specification and Selection Issues By Kiviet, Jan
  17. The relationship between property transaction prices, turnover rates and buyers' and sellers' reservation price distributions By Gunnelin, Åke; Netzell, Olof
  18. What Do Sectoral Dynamics Tell Us About the Origins of Business Cycles? By Matthes, Christian; Schwartzman, Felipe

  1. By: Takahiro Hoshino (Faculty of Economics, Keio University); Yuya Shimizu (Graduate School of Economics, Keio University)
    Abstract: We propose an estimation method for population moments or population parameters in "biased sampling data", in which, for some units, not only the variable of interest but also the covariates have missing observations, and the proportion of missingness is unknown. We use auxiliary information such as the distribution of covariates or their moments in random sampling data in order to correct the bias. Moreover, with additional assumptions, we can correct the bias even if we have only the moment information of covariates. The main contribution of this paper is the development of a doubly robust-type estimator for biased sampling data. This method provides a consistent estimator if either the regression function or the assignment mechanism is correctly specified. We prove the consistency and semi-parametric efficiency of the doubly robust estimator. Both the simulation and empirical application results demonstrate that the proposed estimation method is more robust than existing methods.
    Keywords: Auxiliary information, Biased sampling, Missing data, Propensity score, Doubly robust estimator
    JEL: C13 C18 C83
    Date: 2019–02–11
    URL: http://d.repec.org/n?u=RePEc:keo:dpaper:2019-006&r=all
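    Sketch: A minimal illustration of the double-robustness property the paper extends to biased sampling, using the textbook AIPW estimator of a population mean under missing-at-random; the data-generating choices and all names below are hypothetical, not the authors' estimator.

      # Generic doubly robust (AIPW) estimator of E[Y] with Y missing at random:
      # consistent if EITHER the outcome model m(x) OR the propensity e(x) is
      # correctly specified. Illustration only; not the paper's biased-sampling
      # estimator.
      import numpy as np
      from sklearn.linear_model import LinearRegression, LogisticRegression

      rng = np.random.default_rng(0)
      n = 5000
      x = rng.normal(size=(n, 1))
      y = 1.0 + 2.0 * x[:, 0] + rng.normal(size=n)        # E[Y] = 1
      p = 1.0 / (1.0 + np.exp(-(0.5 + x[:, 0])))          # response propensity
      r = rng.binomial(1, p)                              # r = 1: Y observed

      m_hat = LinearRegression().fit(x[r == 1], y[r == 1]).predict(x)
      e_hat = LogisticRegression().fit(x, r).predict_proba(x)[:, 1]

      # AIPW moment: regression prediction plus propensity-weighted correction
      mu_dr = np.mean(m_hat + r * (y - m_hat) / e_hat)
      print(f"doubly robust estimate of E[Y]: {mu_dr:.3f} (truth 1.0)")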
  2. By: Giuseppe Cavaliere (Department of Economics, University of Bologna, Italy); Anders Rahbek (Department of Economics, University of Copenhagen, Denmark)
    Abstract: In this paper we discuss the general application of the bootstrap as a tool for statistical inference in econometric time series models. We do this by considering the implementation of bootstrap inference in the class of double-autoregressive [DAR] models discussed in Ling (2004). DAR models are particularly interesting for illustrating implementation of the bootstrap in time series: first, standard asymptotic inference is usually difficult to implement due to the presence of nuisance parameters under the null hypothesis; second, inference involves testing whether one or more parameters are on the boundary of the parameter space; third, under the alternative hypothesis, fourth or even second order moments may not exist. In most of these cases, the bootstrap is not considered an appropriate tool for inference. Conversely, taking tests of (non-)stationarity as an illustration, we show that although a standard bootstrap based on unrestricted parameter estimation is invalid, a correct implementation of a bootstrap based on restricted parameter estimation (restricted bootstrap) is first-order valid; that is, it is able to replicate, under the null hypothesis, the correct limiting null distribution. Importantly, we also show that the behaviour of this bootstrap under the alternative hypothesis may be different because of a possible lack of finite second-order moments of the bootstrap innovations. For some parameter configurations, this feature makes the restricted bootstrap unable to replicate the null asymptotic distribution when the null is false. We show that this drawback can be fixed by using a new 'hybrid' bootstrap, where the parameter estimates used to construct the bootstrap data are obtained with the null imposed, while the bootstrap innovations are sampled with replacement from the unrestricted residuals. We show that this bootstrap, novel in this framework, mimics the correct asymptotic null distribution irrespective of whether the null is true or false. Throughout the paper, we use a number of examples from the bootstrap time series literature to illustrate the importance of properly defining and analyzing the bootstrap generating process and associated bootstrap statistics.
    Keywords: Bootstrap; Hypothesis testing; Double-Autoregressive models; Parameter on the boundary; Infinite Variance
    JEL: C32
    Date: 2019–04–02
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:1903&r=all
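    Sketch: The restricted-versus-hybrid resampling mechanics described in the abstract, transplanted for brevity to an ordinary AR(1) test of H0: phi = 0 rather than a DAR model; the helper names are illustrative, not the authors' code.

      # Restricted bootstrap: build bootstrap data from the null-imposed model
      # AND resample null-restricted residuals. Hybrid bootstrap: same null-
      # imposed model, but resample the UNRESTRICTED residuals.
      import numpy as np

      rng = np.random.default_rng(1)
      y = rng.normal(size=500)                 # toy "observed" series

      def ar1_fit(z):
          """OLS estimate of phi in z_t = phi * z_{t-1} + e_t, with residuals."""
          zl, zc = z[:-1], z[1:]
          phi = (zl @ zc) / (zl @ zl)
          return phi, zc - phi * zl

      phi_hat, u_unres = ar1_fit(y)            # unrestricted fit and residuals
      u_res = y                                # with phi = 0 imposed, residuals
                                               # are the series itself

      def bootstrap_quantile(innovations, B=999, level=0.95):
          """Bootstrap critical value for |phi*| under the null phi = 0."""
          stats = np.empty(B)
          for b in range(B):
              y_star = rng.choice(innovations, size=len(y), replace=True)
              stats[b] = abs(ar1_fit(y_star)[0])   # under H0, y_t = e_t
          return np.quantile(stats, level)

      print("observed |phi|:        ", abs(phi_hat))
      print("restricted bootstrap cv:", bootstrap_quantile(u_res))
      print("hybrid bootstrap cv:    ", bootstrap_quantile(u_unres))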
  3. By: Kris Boudt; Dries Cornilly; Tim Verdonck
    Abstract: We propose a minimum distance estimator for the higher-order comoments of a multivariate distribution exhibiting a lower dimensional latent factor structure. We derive the influence function of the proposed estimator and prove its consistency and asymptotic normality. The simulation study confirms the large gains in accuracy compared to the traditional sample comoments. The empirical usefulness of the novel framework is shown in applications to portfolio allocation under non-Gaussian objective functions and to the extraction of factor loadings in a dataset with mental ability scores.
    Keywords: Higher-order multivariate moments; latent factor model; minimum distance estimation; risk assessment; structural equation modelling.
    JEL: C10 C13 C51
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:rug:rugwps:19/970&r=all
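    Sketch: The raw sample comoments that the proposed minimum distance estimator improves upon, here the third-order case (coskewness) in the unfolded d x d^2 layout common in portfolio work; a plain computation, not the paper's estimator.

      # Sample third central comoment ("coskewness") of an (n, d) data matrix,
      # unfolded so that M3[i, j*d + k] = mean((xi-mui)(xj-muj)(xk-muk)).
      import numpy as np

      def sample_coskewness(X):
          Xc = X - X.mean(axis=0)
          n, d = Xc.shape
          # each row's Kronecker square collects the d^2 cross-products
          return Xc.T @ np.stack([np.kron(row, row) for row in Xc]) / n

      rng = np.random.default_rng(2)
      X = rng.standard_t(df=5, size=(1000, 3))    # heavy-tailed toy returns
      M3 = sample_coskewness(X)
      print(M3.shape)                              # (3, 9)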
  4. By: Sankar, Subhra; Bergsma, Wicher; Dassios, Angelos
    Abstract: Consider a nonparametric regression model Y = m(X) + ε, where m is an unknown regression function, Y is a real-valued response variable, X is a real covariate, and ε is the error term. In this article, we extend the usual tests for homoscedasticity by developing consistent tests for independence between X and ε. Further, we investigate the local power of the proposed tests using Le Cam’s contiguous alternatives. An asymptotic power study under local alternatives, along with an extensive finite sample simulation study, shows that the performance of the new tests is competitive with existing ones. Furthermore, the practicality of the new tests is shown using two real data sets.
    Keywords: asymptotic power; contiguous alternatives; distance covariance; Kendall’s tau; nonparametric regression model; measure of association
    JEL: C1
    Date: 2017–10–02
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:83780&r=all
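    Sketch: The flavour of the proposed tests in a few lines: estimate m nonparametrically, then test independence of X and the residuals with a distance-covariance permutation test; the bandwidth, kernel and number of permutations are arbitrary illustration choices, not the paper's calibration.

      # Permutation test of independence between X and eps_hat = Y - m_hat(X),
      # using the (squared) sample distance covariance.
      import numpy as np
      from scipy.spatial.distance import pdist, squareform

      def dcov2(a, b):
          """Squared sample distance covariance of two 1-d samples."""
          A = squareform(pdist(a[:, None]))
          B = squareform(pdist(b[:, None]))
          A = A - A.mean(0) - A.mean(1)[:, None] + A.mean()
          B = B - B.mean(0) - B.mean(1)[:, None] + B.mean()
          return np.mean(A * B)

      rng = np.random.default_rng(3)
      n = 300
      x = rng.uniform(-2, 2, n)
      y = np.sin(x) + (0.2 + 0.3 * np.abs(x)) * rng.normal(size=n)  # X-dependent noise

      h = 0.3                                  # Nadaraya-Watson estimate of m
      w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
      eps = y - (w @ y) / w.sum(axis=1)

      t_obs = dcov2(x, eps)                    # permute residuals to mimic H0
      t_perm = np.array([dcov2(x, rng.permutation(eps)) for _ in range(499)])
      print("p-value:", (1 + np.sum(t_perm >= t_obs)) / 500)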
  5. By: Rice, Gregory; Wirjanto, Tony; Zhao, Yuqian
    Abstract: Functional data objects that are derived from high-frequency financial data often exhibit volatility clustering characteristic of conditionally heteroscedastic time series. Versions of functional generalized autoregressive conditionally heteroscedastic (FGARCH) models have recently been proposed to describe such data, but so far basic diagnostic tests for these models are not available. We propose two portmanteau type tests to measure conditional heteroscedasticity in the squares of financial asset return curves. A complete asymptotic theory is provided for each test, and we further show how they can be applied to model residuals in order to evaluate the adequacy of, and aid in the order selection of, FGARCH models. Simulation results show that both tests have good size and power to detect conditional heteroscedasticity and model mis-specification in finite samples. In an application, the proposed tests reveal that intra-day asset return curves exhibit conditional heteroscedasticity. Additionally, we find that this conditional heteroscedasticity cannot be explained by the magnitude of inter-daily returns alone, but that it can be adequately modeled by an FGARCH(1,1) model.
    Keywords: Functional time series, Heteroscedasticity testing, Model diagnostic checking, High-frequency volatility models, Intra-day asset price
    JEL: C12 C32 C58 G10
    Date: 2019–03–31
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93048&r=all
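    Sketch: A drastically simplified scalar analogue of the proposed portmanteau tests, applying a Ljung-Box-type statistic to squared-curve summaries; the genuine tests operate on the full autocovariance surfaces of the squared curves, so this sketch only conveys the idea.

      # Ljung-Box-type portmanteau statistic on daily squared-curve summaries.
      import numpy as np
      from scipy.stats import chi2

      rng = np.random.default_rng(4)
      curves = rng.normal(size=(250, 78))      # 250 days x 78 intraday points
      v = (curves ** 2).mean(axis=1)           # one squared summary per day
      v = v - v.mean()

      K, n = 10, len(v)
      rho = np.array([(v[k:] @ v[:-k]) / (v @ v) for k in range(1, K + 1)])
      Q = n * (n + 2) * np.sum(rho ** 2 / (n - np.arange(1, K + 1)))
      print("Q =", Q, " p-value =", chi2.sf(Q, df=K))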
  6. By: Daniel Czarnowske; Amrei Stammann
    Abstract: Empirical economists are often deterred from the application of binary choice models with fixed effects mainly for two reasons: the incidental parameter bias and the computational challenge in (moderately) large data sets. We show how both issues can be alleviated in the context of binary choice models with individual and time fixed effects. Thanks to several bias corrections proposed by Fernandez-Val and Weidner (2016), the incidental parameter bias can be reduced substantially. In order to make the estimation feasible even in panels with many fixed effects, we develop an efficient software routine, embedded in the R package alpaca, that combines these corrections with an approach called the method of alternating projections. Further, we contribute to the existing literature by conducting extensive simulation experiments in large and even unbalanced panel settings. Finally, we estimate a dynamic probit model to study the inter-temporal labor force participation of women in Germany.
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.04217&r=all
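    Sketch: The alternating-projections idea the abstract refers to, in its simplest linear form: sweep out individual and time effects by repeated demeaning until convergence. alpaca embeds this step inside the IRLS iterations of the binary choice estimator; this standalone sketch is not the package's code.

      # Two-way fixed effects removed by the method of alternating projections.
      import numpy as np

      def demean_two_way(x, ids, times, tol=1e-10, max_iter=1000):
          x = x.astype(float).copy()
          for _ in range(max_iter):
              x_old = x.copy()
              for g in (ids, times):           # project out each effect in turn
                  x -= (np.bincount(g, weights=x) / np.bincount(g))[g]
              if np.max(np.abs(x - x_old)) < tol:
                  break
          return x

      rng = np.random.default_rng(5)
      N, T = 200, 20
      ids = np.repeat(np.arange(N), T)
      times = np.tile(np.arange(T), N)
      x = rng.normal(size=N)[ids] + rng.normal(size=T)[times] + rng.normal(size=N * T)

      x_tilde = demean_two_way(x, ids, times)
      print(np.abs(np.bincount(ids, weights=x_tilde)).max())   # ~ 0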
  7. By: Prateek Bansal; Rico Krueger; Michel Bierlaire; Ricardo A. Daziano; Taha H. Rashidi
    Abstract: Variational Bayes (VB) methods have emerged as a fast and computationally efficient alternative to Markov chain Monte Carlo (MCMC) methods for Bayesian estimation of mixed multinomial logit (MMNL) models. It has been established that VB is substantially faster than MCMC with practically no compromise in predictive accuracy. In this paper, we address two critical gaps concerning the usage and understanding of VB for MMNL. First, extant VB methods are limited to utility specifications involving only individual-specific taste parameters. Second, the finite-sample properties of VB estimators and the relative performance of VB, MCMC and maximum simulated likelihood estimation (MSLE) are not known. To address the former, this study extends several VB methods for MMNL to admit utility specifications including both fixed and random utility parameters. To address the latter, we conduct an extensive simulation-based evaluation to benchmark the extended VB methods against MCMC and MSLE in terms of estimation times, parameter recovery and predictive accuracy. The results suggest that all VB variants perform as well as MCMC and MSLE at prediction and recovery of all model parameters with the exception of the covariance matrix of the multivariate normal mixing distribution. In particular, VB with nonconjugate variational message passing and the delta-method (VB-NCVMP-Delta) is relatively accurate and up to 15 times faster than MCMC and MSLE. On the whole, VB-NCVMP-Delta is most suitable for applications in which fast predictions are paramount, while MCMC should be preferred in applications in which accurate inferences are most important.
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.03647&r=all
  8. By: Chen, Ruxin; Tabri, Rami V.
    Abstract: Empirical likelihood is effective in many different practical situations involving moment equality and/or inequality restrictions. However, in applications with nonlinear functionals of the underlying distribution, it becomes computationally more difficult to implement. We propose the use of jackknife empirical likelihood (Jing et al., 2009) to circumvent the computational difficulties with nonlinear inequality constraints and establish the chi-bar-square distribution as the limiting null distribution of the resulting empirical likelihood-ratio statistic, where a finite number of inequalities on functionals that are regular, in the sense of Hoeffding (1948), defines the null hypothesis. The class of regular functionals includes many nonlinear functionals that arise in practice and has moments as a special case. To overcome the implementation challenges with this non-pivotal asymptotic null distribution, we propose an empirical likelihood bootstrap procedure that is valid with uniformity. Finally, we investigate the finite-sample properties of the bootstrap procedure using Monte Carlo simulations and find that the results are promising.
    Keywords: Jackknife Empirical Likelihood; Bootstrap Test; Inequality Restrictions; U-statistics
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:syd:wpaper:2019-07&r=all
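    Sketch: The jackknife step that makes empirical likelihood tractable here: replace a nonlinear U-statistic by its jackknife pseudo-values, which behave like i.i.d. observations to which mean-type EL applies. Shown for Gini's mean difference; the EL optimisation itself is omitted.

      # Jackknife pseudo-values V_i = n*T_n - (n-1)*T_{n-1,-i} for a U-statistic.
      import numpy as np

      def gini_mean_difference(x):
          """Degree-2 U-statistic with kernel |x_i - x_j|."""
          n = len(x)
          return np.abs(x[:, None] - x[None, :]).sum() / (n * (n - 1))

      def jackknife_pseudo_values(x, stat):
          n = len(x)
          t_loo = np.array([stat(np.delete(x, i)) for i in range(n)])
          return n * stat(x) - (n - 1) * t_loo

      rng = np.random.default_rng(6)
      x = rng.exponential(size=200)
      V = jackknife_pseudo_values(x, gini_mean_difference)
      # for a U-statistic the pseudo-value mean reproduces the statistic exactly
      print(V.mean(), gini_mean_difference(x))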
  9. By: Dennis Kristensen; Young Jun Lee
    Abstract: We develop a novel asymptotic theory for local polynomial (quasi-) maximum-likelihood estimators of time-varying parameters in a broad class of nonlinear time series models. Under weak regularity conditions, we show the proposed estimators are consistent and follow normal distributions in large samples. Our conditions impose weaker smoothness and moment conditions on the data-generating process and its likelihood compared to existing theories. Furthermore, the bias terms of the estimators take a simpler form. We demonstrate the usefulness of our general results by applying our theory to local (quasi-)maximum-likelihood estimators of time-varying VAR, ARCH and GARCH, and Poisson autoregression models. For the first three models, we are able to substantially weaken the conditions found in the existing literature. For the Poisson autoregression, existing theories cannot be applied, while our novel approach allows us to analyze it.
    Date: 2019–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.05209&r=all
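    Sketch: A local-constant member of the estimator class studied here: kernel-weighted (quasi-)likelihood estimation of a time-varying AR(1) coefficient phi(t/T). Under Gaussian QML this reduces to kernel-weighted least squares; the bandwidth and kernel are arbitrary illustration choices.

      # Local (quasi-)ML estimation of phi(u) in y_t = phi(t/T) y_{t-1} + e_t.
      import numpy as np

      rng = np.random.default_rng(7)
      T = 1000
      phi_true = 0.5 * np.sin(2 * np.pi * np.arange(T) / T)
      y = np.zeros(T)
      for t in range(1, T):
          y[t] = phi_true[t] * y[t - 1] + rng.normal()

      def local_phi(u, h=0.1):
          """Gaussian-kernel-weighted least squares around rescaled time u."""
          t = np.arange(1, T)
          k = np.exp(-0.5 * ((t / T - u) / h) ** 2)
          return np.sum(k * y[1:] * y[:-1]) / np.sum(k * y[:-1] ** 2)

      for u in (0.25, 0.5, 0.75):
          print(u, local_phi(u), 0.5 * np.sin(2 * np.pi * u))   # estimate vs truth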
  10. By: M. Hashem Pesaran; Cynthia Fan Yang
    Abstract: Estimation and inference in the spatial econometrics literature are carried out assuming that the matrix of spatial or network connections has uniformly bounded absolute column sums in the number of cross-section units, n. In this paper, we consider spatial models where this restriction is relaxed. The linear-quadratic central limit theorem of Kelejian and Prucha (2001) is generalized and then used to establish the asymptotic properties of the GMM estimator due to Lee (2007) in the presence of dominant units. A new Bias-Corrected Method of Moments estimator is also proposed that avoids the problem of weak instruments by self-instrumenting the spatially lagged dependent variable. Both estimators are shown to be consistent and asymptotically normal, depending on the rate at which the maximum column sum of the weights matrix rises with n. The small sample properties of the estimators are investigated by Monte Carlo experiments and shown to be satisfactory. An empirical application to sectoral price changes in the US over the pre- and post-2008 financial crisis is also provided. It is shown that the share of capital can be estimated reasonably well from the degree of sectoral interdependence using the input-output tables, despite the evidence of dominant sectors being present in the US economy.
    Keywords: spatial autoregressive models, central limit theorems for linear-quadratic forms, dominant units, GMM, bias-corrected method of moments (BMM), US input-output analysis, capital share
    JEL: C13 C21 C23 R15
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:ces:ceswps:_7563&r=all
  11. By: Crump, Richard K. (Federal Reserve Bank of New York); Gospodinov, Nikolay (Federal Reserve Bank of Atlanta)
    Abstract: We investigate the factor structure of the term structure of interest rates and argue that characterizing the minimal dimension of the data-generating process is more challenging than currently appreciated. To circumvent these difficulties, we introduce a novel nonparametric bootstrap that is robust to general forms of time and cross-sectional dependence and conditional heteroskedasticity of unknown form. We show that our bootstrap procedure is asymptotically valid and exhibits excellent finite-sample properties in simulations. We demonstrate the applicability of our results in two empirical exercises: First, we show that measures of equity market tail risk and the state of the macroeconomy predict bond returns beyond the level or slope of the yield curve; second, we provide a bootstrap-based bias correction and confidence intervals for the probability of recession based on the shape of the yield curve. Our results apply more generally to all assets with a finite maturity structure.
    Keywords: term structure of interest rates; factor models; principal components; bond risk premiums; resampling-based inference
    Date: 2019–04–01
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:884&r=all
  12. By: McCracken, Michael W. (Federal Reserve Bank of St. Louis)
    Abstract: In this note we provide simulation evidence on the size and power of tests of predictive ability described in Giacomini and White (2006). Our goals are modest but non-trivial. First, we establish that there exist data generating processes that satisfy the null hypotheses of equal finite-sample (un)conditional predictive ability. We then consider various parameterizations of these DGPs as a means of evaluating the size and power properties of the proposed tests. While some of our results reinforce those in Giacomini and White (2006), others do not. We recommend against using the fixed scheme when conducting these tests and provide evidence that very large bandwidths are sometimes required when estimating long-run variances.
    Keywords: prediction; out-of-sample; inference
    JEL: C12 C52 C53
    Date: 2019–03–01
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2019-011&r=all
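    Sketch: The one-step-ahead Giacomini-White statistic that the note's simulations exercise: with instruments h_t, the quadratic form T * zbar' Omega^{-1} zbar is asymptotically chi-square under equal conditional predictive ability; the instrument choice below (a constant and the lagged loss differential) is an illustration.

      # Conditional predictive ability test of Giacomini and White (2006).
      import numpy as np
      from scipy.stats import chi2

      def gw_test(d):
          """d: loss-differential series (loss of model 1 minus model 2)."""
          h = np.column_stack([np.ones(len(d) - 1), d[:-1]])   # instruments h_t
          z = h * d[1:, None]                                  # z_t = h_t d_{t+1}
          zbar = z.mean(axis=0)
          omega = z.T @ z / len(z)    # for 1-step forecasts z_t is an m.d.s.
          stat = len(z) * zbar @ np.linalg.solve(omega, zbar)
          return stat, chi2.sf(stat, df=h.shape[1])

      rng = np.random.default_rng(8)
      d = rng.normal(size=500)        # null of equal predictive ability is true
      print(gw_test(d))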
  13. By: Choi, Chi-Young (University of Texas at Arlington); Chudik, Alexander (Federal Reserve Bank of Dallas)
    Abstract: We compare the finite sample performance of a variety of consistent approaches to estimating Impulse Response Functions (IRFs) in a linear setup when the shock of interest is observed. Although there is no uniformly superior approach, iterated approaches turn out to perform well in terms of root mean-squared error (RMSE) in diverse environments and sample sizes. For smaller sample sizes, parsimonious specifications are preferred over full specifications with all ‘relevant’ variables.
    Keywords: Observed shock; Impulse-response functions; Monte Carlo experiments; Finite sample performance
    JEL: C13 C50
    Date: 2019–03–04
    URL: http://d.repec.org/n?u=RePEc:fip:feddgw:353&r=all
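    Sketch: One of the direct approaches in the comparison, a local projection: with the shock series s_t observed, the h-step response can be estimated by OLS of y_{t+h} on s_t (controls omitted for brevity); in the toy DGP below the true IRF is 0.7^h.

      # Local projection IRFs when the shock is observed.
      import numpy as np

      rng = np.random.default_rng(9)
      T = 500
      s = rng.normal(size=T)                          # observed shock
      y = np.zeros(T)
      for t in range(1, T):
          y[t] = 0.7 * y[t - 1] + s[t] + 0.3 * rng.normal()

      def lp_irf(y, s, H=8):
          irf = np.empty(H + 1)
          for h in range(H + 1):
              X = np.column_stack([np.ones(T - h), s[:T - h]])
              irf[h] = np.linalg.lstsq(X, y[h:], rcond=None)[0][1]
          return irf

      print(np.c_[lp_irf(y, s), 0.7 ** np.arange(9)])  # estimates vs truth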
  14. By: Spencer Wheatley (ETH Zurich); Alexander Wehrli (ETH Zurich); Didier Sornette (ETH Zürich - Department of Management, Technology, and Economics (D-MTEC); Swiss Finance Institute)
    Abstract: The endo-exo problem lies at the heart of statistical identification in many fields of science, and is often plagued by spurious strong-and-long memory due to improper treatment of trends, shocks and shifts in the data. A class of models that has been shown to be useful in discerning exogenous and endogenous activity is the Hawkes process. This class of point processes has enjoyed great recent popularity and rapid development within the quantitative finance literature, with particular focus on the study of market microstructure and high frequency price fluctuations. We show that there are important lessons from older fields like time series and econometrics that should also be applied in financial point process modelling. In particular, we emphasize the importance of appropriately treating trends and shocks for the identification of the strength and length of memory in the system. We exploit the powerful Expectation Maximization (EM) algorithm and objective statistical criteria (BIC) to select the flexibility of the deterministic background intensity. With these methods, we strongly reject the hypothesis that the considered financial markets are critical at univariate and bivariate microstructural levels.
    Keywords: mid-price changes, trade times, Hawkes process, endogeneity, criticality, Expectation-Maximization, BIC, non-stationarity, ARMA point process, spurious inference, external shocks
    JEL: C01 C40 C52
    Date: 2018–08
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp1857&r=all
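    Sketch: The object at the centre of the endo-exo question: for a Hawkes process with background rate mu and exponential kernel alpha*exp(-beta*t), endogeneity is measured by the branching ratio n = alpha/beta, with criticality at n = 1. Below, the exact log-likelihood via the standard O(N) recursion, maximised directly rather than by the EM algorithm the authors use, and with a constant (not flexible) background intensity.

      # Exponential-kernel Hawkes log-likelihood and branching ratio estimate.
      import numpy as np
      from scipy.optimize import minimize

      def hawkes_loglik(params, t, T):
          mu, alpha, beta = params
          if min(mu, alpha, beta) <= 0 or alpha >= beta:
              return -1e15                     # enforce subcriticality n < 1
          A, ll = 0.0, 0.0                     # A_i = sum_{j<i} exp(-beta(t_i-t_j))
          for i in range(len(t)):
              if i > 0:
                  A = np.exp(-beta * (t[i] - t[i - 1])) * (1.0 + A)
              ll += np.log(mu + alpha * A)
          comp = mu * T + (alpha / beta) * np.sum(1 - np.exp(-beta * (T - t)))
          return ll - comp                     # log-likelihood minus compensator

      rng = np.random.default_rng(10)
      t = np.sort(rng.uniform(0, 1000, size=800))   # Poisson stream: n should be ~0
      res = minimize(lambda p: -hawkes_loglik(p, t, 1000.0),
                     x0=[0.5, 0.1, 1.0], method="Nelder-Mead")
      mu, alpha, beta = res.x
      print("branching ratio n =", alpha / beta)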
  15. By: Angelos Alexopoulos; Petros Dellaportas; Omiros Papaspiliopoulos
    Abstract: We take a new look at the problem of disentangling the volatility and jumps processes in a panel of stock daily returns. We first provide an efficient computational framework that deals with the stochastic volatility model with Poisson-driven jumps in a univariate scenario that offers a competitive inference alternative to the existing implementation tools. This methodology is then extended to a large set of stocks in which it is assumed that the unobserved jump intensities of each stock co-evolve in time through a dynamic factor model. A carefully designed sequential Monte Carlo algorithm provides out-of-sample empirical evidence that our suggested model outperforms, with respect to predictive Bayes factors, models that do not exploit the panel structure of stocks.
    Date: 2019–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1904.05312&r=all
  16. By: Kiviet, Jan
    Abstract: A motivated strategy is presented for finding, step by step, an adequate model specification and a matching set of instrumental variables by applying the programming tools provided by the Stata package Xtabond2. The aim is to implement generalized method of moments techniques such that useful and reasonably accurate inferences are extracted from an observational panel data set on a single microeconometric structural, presumably dynamic, behavioral relationship. The suggested specification search pursues three comprehensive, heavily interconnected goals, namely: (i) to include all the relevant, appropriately transformed, possibly lagged regressors, as well as any interactions between these that are required to relax the otherwise very strict homogeneity restrictions on the dynamic impacts of the explanatories in standard linear panel data models; (ii) to correctly classify all regressors as either endogenous, predetermined or exogenous, as well as either effect-stationary or effect-nonstationary, implying which internal variables could represent valid and relatively strong instruments; and (iii) to enhance the accuracy of inference in finite samples by omitting irrelevant regressors and by profitably reducing the space spanned by the full set of available internal instruments. For the various tests that trigger the decisions made in the sequential selection process, the considerations relevant to interpreting the magnitude of p-values are spelled out. The complexities of establishing and interpreting the ultimately estimated dynamic impacts are also explained. Finally, the developed strategy is applied to a classic data set and is shown to yield new insights.
    Keywords: classification of explanatories; dynamic impacts; interactions; feedback mechanisms; generalized method of moments; labor demand; model building strategy; short panels.
    JEL: C18 C23 C26 C52 C81 C87 J23
    Date: 2019–04–05
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93147&r=all
  17. By: Gunnelin, Åke (Department of Real Estate and Construction Management, Royal Institute of Technology); Netzell, Olof (National Board of Housing, Building and Planning)
    Abstract: This paper analyzes the relationship between movements in property transaction prices and movements in the underlying reservation price distributions of buyers and sellers, and how these movements are linked to time-varying turnover rates. A main conclusion in previous research is that transaction prices lag changes in buyers’ reservation price distribution and that an index tracking transaction prices is less volatile than an index tracking buyer reserves. We show that our less restrictive model of search and price formation reverses the volatility result of previous papers in realistic scenarios, i.e., transaction prices may be more volatile than the underlying buyer reserves. We model transaction prices and turnover rates as functions of the moments of buyers’ and sellers’ reservation price distributions, the search intensity, and the average bargaining power of buyers and sellers, respectively. We derive the probability density function of transaction prices as a function of these parameters, and hence a maximum-likelihood estimator of the parameters, which serves as a new method for estimating indexes that track movements in reservation price distributions from transaction data. We perform simulations showing that the maximum-likelihood estimator works as intended.
    Keywords: Price formation; Transaction price index; Index tracking; Reservation price distributions; Turnover rates; House price volatility
    JEL: C21 C51 D30 R39
    Date: 2019–04–05
    URL: http://d.repec.org/n?u=RePEc:hhs:kthrec:2019_002&r=all
  18. By: Matthes, Christian (Federal Reserve Bank of Richmond); Schwartzman, Felipe (Federal Reserve Bank of Richmond)
    Abstract: We use economic theory to rank the impact of structural shocks across sectors. This ranking helps us to identify the origins of U.S. business cycles. To do this, we introduce a Hierarchical Vector Auto-Regressive model, encompassing aggregate and sectoral variables. We find that shocks whose impact originate in the "demand" side (monetary, household, and government consumption) account for 43 percent more of the variance of U.S. GDP growth at business cycle frequencies than identified shocks originating in the "supply" side (technology and energy). Furthermore, corporate financial shocks, which theory suggests propagate to large extent through demand channels, account for an amount of the variance equal to an additional 82 percent of the fraction explained by these supply shocks.
    Keywords: Aggregate Shocks; Sectoral Data; Bayesian Analysis; Impulse Responses
    JEL: C11 C50 E30
    Date: 2019–03–29
    URL: http://d.repec.org/n?u=RePEc:fip:fedrwp:19-09&r=all

This nep-ecm issue is ©2019 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.