
nep-ecm New Economics Papers
on Econometrics
Issue of 2008‒11‒18
25 papers chosen by
Sune Karlsson
Orebro University

  1. Estimation of Dynamic Models with Nonparametric Simulated Maximum Likelihood By Dennis Kristensen; Yongseok Shin
  2. Inference in Regression Models with Many Regressors By Stanislav Anatolyev
  3. Specification Testing in Models with Many Instruments By Stanislav Anatolyev; Nikolay Gospodinov
  4. A Quasi Maximum Likelihood Approach for Large Approximate Dynamic Factor Models By Catherine Doz; Domenico Giannone; Lucrezia Reichlin
  5. Modeling Non-Linear Spatial Dynamics: A Family of Spatial STAR Models and an Application to U.S. Economic Growth By Pede, Valerien O.; Florax, Raymond J.G.M.; Holt, Matthew T.
  6. Short-Term Forecasts of Euro Area GDP Growth By Elena Angelini; Gonzalo Camba-Mendez; Domenico Giannone; Lucrezia Reichlin; Gerhard Rünstler
  7. Opening the Black Box: Structural Factor Models with Large Cross-Sections By Mario Forni; Domenico Giannone; Marco Lippi; Lucrezia Reichlin
  8. Large Bayesian VARs By Marta Banbura; Domenico Giannone; Lucrezia Reichlin
  9. On the Correlation Structure of Microstructure Noise in Theory and Practice By Francis X. Diebold; Georg H. Strasser
  10. "Realized Volatility, Covariance and Hedging Coefficient of the Nikkei-225 Futures with Micro-Market Noise" By Naoto Kunitomo; Seisho Sato
  11. Impact of time–inhomogeneous jumps and leverage type effects on returns and realised variances By Almut E. D. Veraart
  12. Temporal aggregation of univariate and multivariate time series models: A survey By Andrea Silvestrini; David Veredas
  13. Estimating Farm Level Multivariate Yield Distribution Using Nonparametric Methods By Zheng, Qiujie; Wang, H. Holly; Shi, Qinghua
  14. Modeling Censored Data Using Mixture Regression Models with an Application to Cattle Production Yields By Belasco, Eric J.; Ghosh, Sujit K.
  15. A Simple Hypothesis Test for Heteroscedasticity By Venier, Guido
  16. Optimal Linear Filtering, Smoothing and Trend Extraction for m-period Differences of Processes with a Unit Root By Dimitrios Thomakos
  17. Are High-Tech Employment and Natural Amenities Linked?: Answers from a Smoothed Bayesian Spatial Model By Dorfman, Jeffrey H.; Patridge, Mark D.; Galloway, Hamilton
  18. HEDONIC PRICE FUNCTIONS: GUIDANCE ON EMPIRICAL SPECIFICATION By Kuminoff, Nicolai V.; Parmeter, Christopher F.; Pope, Jaren C.
  19. Modelling acreage decisions within the multinomial logit framework : profit functions and discrete choice models By Carpentier, Alain; Letort, Elodie
  20. Quantile Regression Methods of Estimating Confidence Intervals for WASDE Price Forecasts By Isengildina-Massa, Olga; Irwin, Scott H.; Good, Darrel L.
  21. Recovering Preferences from a Dual-Market Locational Equilibrium By Kuminoff, Nicolai V.
  22. Estimation of Efficiency with the Stochastic Frontier Cost Function and Heteroscedasticity: A Monte Carlo Study By Kim, Taeyoon; Brorsen, Wade; Kenkel, Philip
  23. Spatial Price Adjustment with and without Trade By Stephens, Emma C.; Mabaya, Edward
  24. A Spatial Hedonic Model with Time-Varying Parameters: A New Method Using Flexible Least Squares By Kuethe, Todd H.; Foster, Kenneth A.; Florax, Raymond J.G.M.
  25. Bayesian Estimation of a Censored AIDS Model for Whole Grain Products By ISHDORJ, Ariun; JENSEN, Helen H.

  1. By: Dennis Kristensen; Yongseok Shin (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: We propose a simulated maximum likelihood estimator for dynamic models based on nonparametric kernel methods. Our method is designed for models without latent dynamics from which one can simulate observations but cannot obtain a closed-form representation of the likelihood function. Using the simulated observations, we nonparametrically estimate the density - which is unknown in closed form - by kernel methods, and then construct a likelihood function that can be maximized. We prove for dynamic models that this nonparametric simulated maximum likelihood (NPSML) estimator is consistent and asymptotically efficient. NPSML is applicable to general classes of models and is easy to implement in practice.
    Keywords: dynamic models, estimation, kernel density estimation, maximum-likelihood, simulation
    JEL: C13 C14 C15 C32 C35
    Date: 2008–11–13
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-58&r=ecm
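To make the NPSML recipe concrete, here is a minimal Python sketch under strong simplifying assumptions: a toy AR(1) model (whose transition density we pretend is unknown in closed form), a fixed kernel bandwidth, and a grid search instead of a proper optimizer. None of these choices come from the paper.

```python
# Sketch of nonparametric simulated maximum likelihood (NPSML).
# Toy model (hypothetical, for illustration only): an AR(1) whose
# transition density we pretend cannot be written in closed form.
import numpy as np

rng = np.random.default_rng(0)

def simulate_step(y_prev, theta, size, rng):
    """Draw `size` simulated values of y_t given y_{t-1} under theta."""
    return theta * y_prev + rng.standard_normal(size)

def npsml_loglik(theta, y, n_sim=500, bandwidth=0.3, rng=rng):
    """Kernel-based simulated log-likelihood of the observed path."""
    ll = 0.0
    for t in range(1, len(y)):
        draws = simulate_step(y[t - 1], theta, n_sim, rng)
        # Gaussian kernel density estimate of the transition density,
        # evaluated at the observed y_t.
        u = (y[t] - draws) / bandwidth
        dens = np.mean(np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)) / bandwidth
        ll += np.log(max(dens, 1e-300))
    return ll

# Generate data with true theta = 0.6 and maximize over a grid.
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.6 * y[t - 1] + rng.standard_normal()

grid = np.linspace(0.0, 0.9, 19)
theta_hat = grid[np.argmax([npsml_loglik(th, y) for th in grid])]
print("NPSML estimate:", theta_hat)
```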
  2. By: Stanislav Anatolyev (New Economic School)
    Abstract: We investigate the behavior of various standard and modified F, LR and LM tests in linear regressions, adopting an alternative asymptotic framework where the number of regressors, and possibly of restrictions, grows proportionately to the sample size. When restrictions are not numerous, the rescaled classical test statistics are asymptotically chi-squared irrespective of whether there are many or few regressors. However, when restrictions are numerous, standard asymptotic versions of classical tests are invalid. We propose and analyze asymptotically valid versions of the classical tests, including those that are robust to the numerosity of regressors. We also compare their higher-order asymptotic size properties and powers for different types of local alternatives. It turns out that an "exact" F test that appeals to critical values of the F distribution is best in terms of such properties.
    Keywords: Alternative asymptotics, linear regression, test size, test power, F test, Wald test, Likelihood Ratio test, Lagrange Multiplier test.
    JEL: C12 C21
    Date: 2008–09
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0125&r=ecm
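A small Monte Carlo sketch of the size distortion the paper addresses: when the number of restrictions grows with the sample, referring the Wald statistic q*F to a chi-squared(q) critical value over-rejects, while the exact F critical value keeps size under normal errors. The dimensions below are illustrative, not taken from the paper.

```python
# Monte Carlo: chi-squared vs. exact F critical values with many
# regressors (k) and many restrictions (q). All sizes are toy choices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, k, q, reps = 200, 100, 50, 2000   # q = number of zero restrictions
rej_chi2 = rej_F = 0

for _ in range(reps):
    X = rng.standard_normal((n, k))
    y = rng.standard_normal(n)          # null: all coefficients are zero
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ beta
    rss_u = resid @ resid
    # Restricted model: drop the last q regressors.
    Xr = X[:, : k - q]
    br = np.linalg.lstsq(Xr, y, rcond=None)[0]
    rss_r = (y - Xr @ br) @ (y - Xr @ br)
    F = ((rss_r - rss_u) / q) / (rss_u / (n - k))
    rej_chi2 += q * F > stats.chi2.ppf(0.95, q)
    rej_F += F > stats.f.ppf(0.95, q, n - k)

print("chi2 rejection rate:", rej_chi2 / reps)   # well above 0.05
print("exact F rejection rate:", rej_F / reps)   # close to 0.05
```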
  3. By: Stanislav Anatolyev (New Economic School); Nikolay Gospodinov (Concordia University and CIREQ)
    Abstract: This paper studies the asymptotic validity of the Anderson-Rubin (AR) test and the J test of overidentifying restrictions in linear models with many instruments. When the number of instruments increases at the same rate as the sample size, we establish that the conventional AR and J tests are asymptotically incorrect. Some versions of these tests that were developed for situations with moderately many instruments are also shown to be asymptotically invalid in this framework. We propose modifications of the AR and J tests that deliver asymptotically correct sizes. Importantly, the corrected tests are robust to the numerosity of the moment conditions in the sense that they are valid for both few and many instruments. The simulation results illustrate the excellent properties of the proposed tests.
    Keywords: Instrumental variables, many instruments, Bekker's asymptotics, Anderson-Rubin test, test for overidentifying restrictions.
    JEL: C12 C21
    Date: 2008–09
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0124&r=ecm
  4. By: Catherine Doz; Domenico Giannone; Lucrezia Reichlin
    Abstract: Is maximum likelihood suitable for factor models in large cross-sections of time series? We answer this question from both an asymptotic and an empirical perspective. We show that estimates of the common factors based on maximum likelihood are consistent for the size of the cross-section (n) and the sample size (T) going to infinity along any path of n and T and that therefore maximum likelihood is viable for n large. The estimator is robust to misspecification of the cross-sectional and time series correlation of the idiosyncratic components. In practice, the estimator can be easily implemented using the Kalman smoother and the EM algorithm as in traditional factor analysis.
    Keywords: Factor Model, large cross-sections, Quasi Maximum Likelihood
    JEL: C51 C32 C33
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2008_034&r=ecm
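As a rough illustration of the state-space implementation mentioned in the abstract, the sketch below fits a one-factor model to a simulated panel with statsmodels' DynamicFactor (Kalman-filter-based ML). The paper's own EM implementation and its large-n asymptotics are not reproduced; panel dimensions are toy choices.

```python
# Quasi-ML factor extraction with the Kalman smoother on simulated data.
import numpy as np
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

rng = np.random.default_rng(2)
T, n = 150, 20                      # sample size and cross-section (toy sizes)
f = np.zeros(T)
for t in range(1, T):               # one AR(1) common factor
    f[t] = 0.7 * f[t - 1] + rng.standard_normal()
lam = rng.uniform(0.5, 1.5, n)      # factor loadings
X = np.outer(f, lam) + rng.standard_normal((T, n))

mod = DynamicFactor(X, k_factors=1, factor_order=1)
res = mod.fit(disp=False)
f_hat = res.smoothed_state[0]       # Kalman-smoothed factor estimate

# Factors are identified only up to scale and sign; check correlation.
print("corr(f, f_hat):", np.corrcoef(f, f_hat)[0, 1])
```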
  5. By: Pede, Valerien O.; Florax, Raymond J.G.M.; Holt, Matthew T.
    Abstract: This paper investigates non-linearity in spatial processes models and allows for a gradual regime-switching structure in the form of a smooth transition autoregressive process. Until now, applications of the smooth transition autoregressive (STAR) model have been largely confined to the time series context. The paper focuses on extending the non-linear smooth transition perspective to spatial processes models, in which spatial correlation is taken into account through the use of a so-called weights matrix identifying the topology of the spatial system. We start by deriving a non-linearity test for a simple spatial model, in which spatial correlation is only included in the transition function. Next, we propose a non-linearity test for a model that includes a spatially lagged dependent variable or spatially autocorrelated innovations as well. Monte Carlo simulations of the various test statistics are performed to examine their power and size. The proposed modeling framework is then used to identify convergence clubs in the context of U.S. county-level economic growth over the period 1963–2003.
    Keywords: spatial econometrics, non-linearity, smooth transition autoregressive, Research Methods/ Statistical Methods, C12, C21, C51, O18, R11,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6518&r=ecm
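A sketch of the gradual regime-switching structure involved, assuming a logistic transition function driven by a spatially lagged transition variable. The ring-shaped weights matrix and all parameter values below are illustrative assumptions, not the paper's specification.

```python
# Smooth-transition structure for a spatial STAR-style model: the
# regime weight is a logistic function of the spatial lag W @ q.
import numpy as np

def logistic_transition(s, gamma, c):
    """G(s; gamma, c) in [0, 1]: smoothness gamma, threshold c."""
    return 1.0 / (1.0 + np.exp(-gamma * (s - c)))

rng = np.random.default_rng(3)
n = 100
# Row-standardized contiguity-style weights for a ring of n regions.
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5

q = rng.standard_normal(n)          # transition variable (e.g. initial income)
x = rng.standard_normal(n)
s = W @ q                           # spatial lag drives the regime switch
G = logistic_transition(s, gamma=4.0, c=0.0)

beta1, beta2 = 0.5, 1.5             # regime-specific slopes
y = (1 - G) * beta1 * x + G * beta2 * x + 0.1 * rng.standard_normal(n)
print("share of regions mostly in regime 2:", np.mean(G > 0.5))
```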
  6. By: Elena Angelini; Gonzalo Camba-Mendez; Domenico Giannone; Lucrezia Reichlin; Gerhard Rünstler
    Abstract: This paper evaluates models that exploit timely monthly releases to compute early estimates of current quarter GDP (now-casting) in the euro area. We compare traditional methods used at institutions with a new method proposed by Giannone, Reichlin, and Small (2005). The method consists in bridging quarterly GDP with monthly data via a regression on factors extracted from a large panel of monthly series with different publication lags. We show that bridging via factors produces more accurate estimates than traditional bridge equations. We also show that survey data and other ‘soft’ information are valuable for now-casting.
    Keywords: Forecasting, Monetary Policy, Factor Model, Real Time Data, Large data-sets, News
    JEL: E52 C33 C53
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2008_035&r=ecm
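A stylized sketch of bridging via factors: extract principal components from a (simulated) monthly panel, aggregate them to quarterly frequency, and run the bridge regression for GDP growth. Ragged publication lags, a key practical ingredient of the method, are ignored here, and all data are invented.

```python
# Bridge equation with principal-component factors on simulated data.
import numpy as np

rng = np.random.default_rng(4)
T_m, n = 240, 50                          # 240 months, 50 monthly series
common = rng.standard_normal(T_m).cumsum() * 0.1
panel = np.outer(common, rng.uniform(0.5, 1.5, n))
panel += rng.standard_normal((T_m, n))

# Principal-component factors from the standardized panel.
Z = (panel - panel.mean(0)) / panel.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
factors_m = U[:, :2] * s[:2]              # first two factors, monthly

# Quarterly aggregation: average the three months of each quarter.
factors_q = factors_m.reshape(-1, 3, 2).mean(axis=1)
gdp = 0.8 * factors_q[:, 0] + 0.2 * rng.standard_normal(len(factors_q))

# Bridge equation: OLS of GDP growth on the aggregated factors.
Xq = np.column_stack([np.ones(len(factors_q)), factors_q])
b = np.linalg.lstsq(Xq, gdp, rcond=None)[0]
print("bridge coefficients:", b.round(3))
```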
  7. By: Mario Forni; Domenico Giannone; Marco Lippi; Lucrezia Reichlin
    Abstract: This paper shows how large-dimensional dynamic factor models are suitable for structural analysis. We argue that all identification schemes employed in SVAR analysis can be easily adapted in dynamic factor models. Moreover, the “problem of fundamentalness”, which is intractable in structural VARs, can be solved, provided that the impulse-response functions are sufficiently heterogeneous. We provide consistent estimators for the impulse-response functions, as well as (n, T) rates of convergence. An exercise with US macroeconomic data shows that our solution of the fundamentalness problem may have important empirical consequences.
    Keywords: Dynamic factor models, structural VARs, identification, fundamentalness
    JEL: E0 C1
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2008_036&r=ecm
  8. By: Marta Banbura; Domenico Giannone; Lucrezia Reichlin
    Abstract: This paper shows that Vector Autoregression with Bayesian shrinkage is an appropriate tool for large dynamic models. We build on the results by De Mol, Giannone, and Reichlin (2008) and show that, when the degree of shrinkage is set in relation to the cross-sectional dimension, the forecasting performance of small monetary VARs can be improved by adding additional macroeconomic variables and sectoral information. In addition, we show that large VARs with shrinkage produce credible impulse responses and are suitable for structural analysis.
    Keywords: Bayesian VAR, Forecasting, Monetary VAR, large cross-sections
    JEL: C11 C13 C33 C53
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2008_033&r=ecm
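A minimal sketch of VAR estimation with Gaussian shrinkage toward zero, a crude stand-in for the Minnesota-type prior used in this literature: the posterior mean is a ridge estimator, with the shrinkage scale playing the role the paper ties to the cross-sectional dimension. Dimensions and the value of lambda are illustrative.

```python
# Ridge-form posterior mean for a VAR(p) under a zero-centered prior.
import numpy as np

rng = np.random.default_rng(5)
T, n, p = 120, 10, 2                     # toy dimensions
Y = rng.standard_normal((T, n)).cumsum(axis=0) * 0.1

# Stack the VAR(p) regression Y_t = c + A1 Y_{t-1} + ... + Ap Y_{t-p} + e.
X = np.column_stack([np.ones(T - p)] +
                    [Y[p - l : T - l] for l in range(1, p + 1)])
Yt = Y[p:]

lam = 0.2                                # overall shrinkage (illustrative)
k = X.shape[1]
penalty = np.eye(k) / lam**2
penalty[0, 0] = 0.0                      # leave the intercept unshrunk
B_post = np.linalg.solve(X.T @ X + penalty, X.T @ Yt)
B_ols = np.linalg.lstsq(X, Yt, rcond=None)[0]
print("shrinkage reduces coefficient norm:",
      np.linalg.norm(B_post), "<", np.linalg.norm(B_ols))
```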
  9. By: Francis X. Diebold (Department of Economics, University of Pennsylvania); Georg H. Strasser (Department of Economics, Boston College)
    Abstract: We argue for incorporating the financial economics of market microstructure into the financial econometrics of asset return volatility estimation. In particular, we use market microstructure theory to derive the cross-correlation function between latent returns and market microstructure noise, which feature prominently in the recent volatility literature. The cross-correlation at zero displacement is typically negative, and cross-correlations at nonzero displacements are positive and decay geometrically. If market makers are sufficiently risk averse, however, the cross-correlation pattern is inverted. Our results are useful for assessing the validity of the frequently assumed independence of latent price and microstructure noise, for explaining observed cross-correlation patterns, for predicting as-yet undiscovered patterns, and for making informed conjectures as to improved volatility estimation methods.
    Keywords: Realized volatility, Market microstructure theory, High-frequency data, Financial econometrics
    JEL: G14 G20 D82 D83 C51
    Date: 2008–10–09
    URL: http://d.repec.org/n?u=RePEc:boc:bocoec:693&r=ecm
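To show what the object of interest looks like in a simulation, the sketch below generates latent returns and a noise process that is, by construction, correlated with them, then computes the sample cross-correlation function at several displacements. The noise specification is an illustrative assumption, not the paper's structural model, so it produces only one of the possible sign patterns.

```python
# Sample cross-correlations between latent returns and noise.
import numpy as np

rng = np.random.default_rng(6)
T = 100_000
r = 0.01 * rng.standard_normal(T)        # latent high-frequency returns
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.5 * eps[t - 1] - 0.3 * r[t]   # noise correlated with returns
p_obs = np.cumsum(r) + eps                   # observed log price

def cross_corr(x, y, lag):
    """corr(x_t, y_{t+lag}) for a given displacement."""
    if lag >= 0:
        a, b = x[: len(x) - lag], y[lag:]
    else:
        a, b = x[-lag:], y[: len(y) + lag]
    return np.corrcoef(a, b)[0, 1]

for lag in range(0, 5):
    print(lag, round(cross_corr(r, eps, lag), 3))
# With this toy noise the cross-correlation is negative at displacement
# zero and decays geometrically; the paper derives which patterns arise
# from microstructure theory.
```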
  10. By: Naoto Kunitomo (Faculty of Economics, University of Tokyo); Seisho Sato (Institute of Statistical Mathematics)
    Abstract: For the problem of estimating the realized volatility, covariance and hedging coefficient from high frequency data with possibly micro-market noise, we use the Separating Information Maximum Likelihood (SIML) method, which was recently developed by Kunitomo and Sato (2008). By analyzing the Nikkei 225 futures and spot index markets, we find that the traditional estimates of realized volatility, covariance and the hedging coefficient have a significant bias that should be corrected. Our method can handle this estimation bias and the tick-size effects of Nikkei 225 futures by removing the possible micro-market noise in multivariate high frequency data.
    Date: 2008–11
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2008cf601&r=ecm
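A sketch of the bias that motivates the paper: realized variance computed from noise-contaminated prices overstates integrated variance, increasingly so at finer sampling. The SIML estimator itself is not reproduced; all magnitudes below are illustrative.

```python
# Realized variance from noisy high-frequency prices at several grids.
import numpy as np

rng = np.random.default_rng(7)
n = 23_400                              # one trading day of 1-second returns
sigma = 0.01                            # daily volatility of the latent price
p_latent = np.cumsum(sigma / np.sqrt(n) * rng.standard_normal(n))
p_obs = p_latent + 0.0005 * rng.standard_normal(n)   # micro-market noise

def realized_variance(p, step):
    """RV from prices sampled every `step` observations."""
    r = np.diff(p[::step])
    return np.sum(r**2)

print("true integrated variance:", sigma**2)
for step in (1, 10, 60, 300):
    print(f"RV sampled every {step:>3} ticks:",
          round(realized_variance(p_obs, step), 6))
# RV on the finest grid is dominated by noise (roughly 2 * m * var(noise)
# for m returns); sparser sampling trades this bias for higher variance.
```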
  11. By: Almut E. D. Veraart (School of Economics and Management, University of Aarhus, Denmark)
    Abstract: This paper studies the effect of time–inhomogeneous jumps and leverage type effects on realised variance calculations when the logarithmic asset price is given by a Lévy–driven stochastic volatility model. In such a model, the realised variance is an inconsistent estimator of the integrated variance. Nevertheless it can be used within a quasi–maximum likelihood setup to draw inference on the model parameters. In order to do that, this paper introduces a new methodology for deriving all cumulants of the returns and realised variance in explicit form by solving a recursive system of inhomogeneous ordinary differential equations.
    Keywords: Lévy processes, stochastic volatility, leverage effect, superposition, realised variance
    JEL: C10 C13 C14 G10 G12
    Date: 2008–11–10
    URL: http://d.repec.org/n?u=RePEc:aah:create:2008-57&r=ecm
  12. By: Andrea Silvestrini (Bank of Italy, Economics and International Relations and CORE, Université catholique de Louvain.); David Veredas (ECARES, Université Libre de Bruxelles and CORE, Université catholique de Louvain, Belgium)
    Abstract: We present a unified and up-to-date overview of temporal aggregation techniques for univariate and multivariate time series models explaining in detail how these techniques are employed. Some empirical applications illustrate the main issues.
    Keywords: Temporal aggregation, ARIMA, Seasonality, GARCH, Vector ARMA, Spurious causality, Multivariate GARCH
    JEL: C10 C22 C32 C43
    Date: 2008–08
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_685_08&r=ecm
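One classic result covered by such surveys can be checked in a few lines: a monthly AR(1), temporally aggregated by quarterly averaging, is no longer an AR(1) but is well approximated by a low-order ARMA. The sketch uses statsmodels; orders and parameters are illustrative choices, not taken from the paper.

```python
# Temporal aggregation of a monthly AR(1) to quarterly averages.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(8)
T = 1200                                  # 100 years of monthly data
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.8 * y[t - 1] + rng.standard_normal()

yq = y.reshape(-1, 3).mean(axis=1)        # quarterly aggregation

ar1 = ARIMA(yq, order=(1, 0, 0)).fit()
arma11 = ARIMA(yq, order=(1, 0, 1)).fit()
print("AIC AR(1):    ", round(ar1.aic, 1))
print("AIC ARMA(1,1):", round(arma11.aic, 1))   # typically preferred
```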
  13. By: Zheng, Qiujie; Wang, H. Holly; Shi, Qinghua
    Abstract: Modeling crop yield distributions has been an important topic in agricultural production and risk analysis, and nonparametric methods have gained attention for their flexibility in describing the shapes of yield density functions. In this article, we apply a nonparametric method to model joint yield distributions based on farm-level data for multiple crops, and also provide a way of simulation for univariate and bivariate distributions. The results show that the nonparametric models, both univariate and bivariate, are estimated quite well compared to the original samples, and the simulated empirical distributions also preserve the attributes of the original samples at a reasonable level. This article provides a feasible way of using multivariate nonparametric methods in further risk and insurance analysis.
    Keywords: yield distribution, multi-variate nonparametric, China, farm-level, risks, Farm Management, Risk and Uncertainty,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6509&r=ecm
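A minimal sketch of the approach with scipy's gaussian_kde: fit a bivariate kernel density to simulated stand-ins for two crop yields and simulate from the fitted density. All data-generating numbers are hypothetical.

```python
# Bivariate kernel density estimation and simulation for joint yields.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(9)
n = 400
corn = rng.normal(150, 25, n)                 # hypothetical yields
soy = 0.3 * corn + rng.normal(40, 8, n)       # correlated second crop
data = np.vstack([corn, soy])                 # shape (2, n) for gaussian_kde

kde = gaussian_kde(data)
print("density at the sample mean:",
      kde(data.mean(axis=1, keepdims=True))[0])

# Simulate from the fitted density (gaussian_kde has a resample method).
sims = kde.resample(1000)
print("simulated means:", sims.mean(axis=1).round(1))
print("sample means:   ", data.mean(axis=1).round(1))
```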
  14. By: Belasco, Eric J.; Ghosh, Sujit K.
    Abstract: This research develops a mixture regression model that is shown to have advantages over the classical Tobit model in model fit and predictive tests when data are generated from a two-step process. Additionally, the model is shown to allow for flexibility in distributional assumptions while nesting the classic Tobit model. A simulated data set is utilized to assess the potential loss in efficiency from model misspecification, assuming the Tobit and a zero-inflated log-normal distribution, which is derived from the generalized mixture model. Results from the simulations highlight the finding that the proposed zero-inflated log-normal model clearly outperforms the Tobit model when data are generated from a two-step process. When data are generated from a Tobit model, forecasts are more accurate when utilizing the Tobit model. However, the Tobit model is shown to be a special case of the generalized mixture model. The empirical model is then applied to evaluating mortality rates in commercial cattle feedlots, both independently and as part of a system including other performance and health factors. This particular application is hypothesized to be more appropriate for the proposed model due to the high degree of censoring and the skewed nature of mortality rates. The zero-inflated log-normal model clearly models and predicts with more accuracy than the Tobit model.
    Keywords: censoring, livestock production, tobit, zero-inflated, bayesian, Livestock Production/Industries,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6341&r=ecm
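A sketch of the zero-inflated log-normal building block, fitted by maximum likelihood: a point mass at zero with probability pi and a log-normal density for positive outcomes. The parameterization is an assumption for illustration; the paper's mixture regression structure and estimation machinery are not reproduced.

```python
# Zero-inflated log-normal likelihood for censored, skewed outcomes.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(11)
n = 1000
zero = rng.random(n) < 0.4                    # 40% exact zeros
y = np.where(zero, 0.0, rng.lognormal(mean=1.0, sigma=0.5, size=n))

def negloglik(params, y):
    logit_pi, mu, log_sig = params
    pi = 1.0 / (1.0 + np.exp(-logit_pi))
    ll = np.sum(np.log(pi) * (y == 0))        # point mass at zero
    pos = y[y > 0]
    ll += np.sum(np.log(1 - pi) +
                 stats.lognorm.logpdf(pos, s=np.exp(log_sig),
                                      scale=np.exp(mu)))
    return -ll

res = optimize.minimize(negloglik, x0=[0.0, 0.0, 0.0], args=(y,))
logit_pi, mu, log_sig = res.x
print("pi:", round(1 / (1 + np.exp(-logit_pi)), 3),
      "mu:", round(mu, 3), "sigma:", round(np.exp(log_sig), 3))
```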
  15. By: Venier, Guido
    Abstract: The scope of this paper is the presentation of a simple hypothesis test that makes it possible to distinguish heteroscedastic data from homoscedastic i.i.d. Gaussian white noise. The main feature is a test statistic that is easy to apply and serves well in carrying out such a test. The power of the statistic is illustrated by examples where it is applied to stock market data and to time series from deterministic diffusion, a chaotic time series process. It turns out that in those cases the statistic rejects the random walk hypothesis with a high degree of confidence and is therefore highly reliable. Furthermore, it is discussed that the test may in most cases also serve as a test for independence and for heteroscedasticity in general. This is exemplified with independent and identically distributed random numbers.
    Keywords: Heteroscedasticity; Hypothesis Test; Independence; Random Walk
    JEL: C12 C01
    Date: 2008–11–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:11591&r=ecm
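The abstract does not spell out the proposed statistic, so it is not reproduced here. For orientation, the sketch below applies a standard alternative, the Breusch-Pagan test, to data whose variance grows with the regressor.

```python
# Breusch-Pagan heteroscedasticity test (a standard reference point,
# not the paper's proposed statistic) via statsmodels.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(12)
n = 500
x = rng.standard_normal(n)
# Heteroscedastic noise: variance grows with |x|.
y = 0.5 * x + (0.5 + np.abs(x)) * rng.standard_normal(n)

X = sm.add_constant(x)
resid = sm.OLS(y, X).fit().resid
lm_stat, lm_pval, f_stat, f_pval = het_breuschpagan(resid, X)
print("LM p-value:", round(lm_pval, 4))   # small => reject homoscedasticity
```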
  16. By: Dimitrios Thomakos
    Abstract: In this paper I consider the problem of optimal linear filtering, smoothing and trend extraction for m-period differences of processes with a unit root. Such processes arise naturally in economics and finance, in the form of rates of change (price inflation, economic growth, financial returns) and finding an appropriate smoother is thus of immediate practical interest. The filter and resulting smoother are based on the methodology of Singular Spectrum Analysis (SSA) and their form and properties are examined in detail. In particular, I find explicit representations for the asymptotic decomposition of the covariance matrix and show that the first two leading eigenvalues of the decomposition account for over 90% of the variability of the process. I examine the structure of the impulse and frequency response functions finding that the optimal filter has a “permanent” and a “transitory component” with the corresponding smoother being the sum of two such components. I also find explicit representations for the extrapolation coefficients that can be used in out-of-sample prediction. The methodology of the paper is illustrated with three short empirical applications using data on U.S. inflation and real GDP growth and data on the Euro/US dollar exchange rate. Finally, the paper contains a new technical result: I derive explicit representations for the filtering weights in the context of SSA for an arbitrary covariance matrix. This result allows one to examine specific effects of smoothing in any situation and has not appeared so far, to the best of my knowledge, in the related literature.
    Keywords: core inflation, business cycles, differences, euro, linear filtering, singular spectrum analysis, smoothing, trading strategies, trend extraction and prediction, unit root.
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:uop:wpaper:0030&r=ecm
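A minimal SSA sketch for intuition: embed the series in a trajectory matrix, take the SVD, and reconstruct a trend from the leading components by diagonal averaging. The window length and the number of retained components are arbitrary choices, not the paper's.

```python
# Basic singular spectrum analysis (SSA) trend extraction.
import numpy as np

rng = np.random.default_rng(13)
T, L = 300, 40                            # series length, window length
t = np.arange(T)
y = 0.02 * t + np.sin(2 * np.pi * t / 50) + 0.5 * rng.standard_normal(T)

K = T - L + 1
traj = np.column_stack([y[i : i + L] for i in range(K)])   # L x K
U, s, Vt = np.linalg.svd(traj, full_matrices=False)

r = 2                                     # keep the leading components
approx = (U[:, :r] * s[:r]) @ Vt[:r]

# Diagonal averaging (Hankelization) back to a series of length T.
trend = np.zeros(T)
counts = np.zeros(T)
for i in range(L):
    for j in range(K):
        trend[i + j] += approx[i, j]
        counts[i + j] += 1
trend /= counts

share = (s[:r] ** 2).sum() / (s**2).sum()
print(f"leading {r} components explain {share:.1%} of the variability")
```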
  17. By: Dorfman, Jeffrey H.; Patridge, Mark D.; Galloway, Hamilton
    Abstract: We investigate the recently advanced theory that high-technology workers are drawn to high amenity locations and then the high-technology jobs follow the workers. Using a novel data set that tracks high-technology job growth by U.S. county, we estimate spatial parameters of the response of job growth to the level of local natural amenities. We achieve this estimation with a reasonably new class of models, smooth coefficient models. The model is employed in a spatial setting to allow for smooth but nonparametric response functions to key variables in an otherwise standard regression model. With spatial data this allows for flexible modeling, such as a unique place-specific effect estimated for each location, and for the responses to key variables to vary by location. This flexibility is achieved through nonparametric smoothing rather than by nearest-neighbor type estimators such as those in geographically weighted regressions. The resulting model can be estimated in a straightforward application of analytical Bayesian techniques. Our results show that amenities can have a significant effect on high-technology employment growth; however, the effect varies over space and by amenity level.
    Keywords: Bayesian econometrics, employment growth, high technology, smooth coefficient models, spatial modeling., Labor and Human Capital, Resource /Energy Economics and Policy,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6459&r=ecm
  18. By: Kuminoff, Nicolai V.; Parmeter, Christopher F.; Pope, Jaren C.
    Abstract: The hedonic pricing model is widely accepted as a method for estimating the marginal willingness to pay for spatially delineated amenities. Empirical applications typically rely on one of three functional forms (linear, semi-log, and double-log) and rarely involve rigorous specification testing. This phenomenon is largely due to an influential simulation study by Cropper, Deck and McConnell (CDM) (1988) that found, among other things, that simpler linear specifications outperformed more flexible functional forms in the face of omitted variables. In the 20 years that have elapsed since their study, there have been major computational advances and significant changes in the way hedonic price functions can be estimated. The purpose of our paper is to update and extend the CDM (1988) simulations to investigate current issues in hedonic modeling. Three preliminary results obtained from our theoretically consistent Monte Carlo simulation are highlighted in this paper: (i) we find that adding spatial fixed effects (census tract dummies) to linear models does improve their performance, both when all attributes are observed and when some attributes are unobserved; (ii) adding the spatial fixed effects to the more flexible specifications, such as the quadratic and quadratic Box-Cox, does not improve their performance when all housing attributes are observed; however, when some housing attributes are unobserved, the spatial fixed effects significantly improve the performance of flexible specifications as well; and (iii) increasing the sample size from CDM's 200 observations to a sample size of 2000 (which is more representative of modern applications) changes the relative performance of different specifications.
    Keywords: Hedonic, Functional Form, Monte Carlo Simulation, Property Value Model, Demand and Price Analysis, Land Economics/Use, Research Methods/ Statistical Methods, Q15, Q51, Q53, C15, R52,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6555&r=ecm
  19. By: Carpentier, Alain; Letort, Elodie
    Abstract: The first purpose of this paper is to propose theoretical justifications for using Logit acreage share models. Two approaches are presented: the Logit shares can be derived from a well-defined profit function or obtained as the result of a set of discrete choices. It is next shown that both theoretical frameworks allow one to define generalizations of the standard Logit shares. These generalizations build on developments of the Multinomial Logit framework for modelling discrete choices and seek to define models that are flexible and empirically tractable. Two applications are presented to illustrate the empirical interest of the proposed models. Both use a rotating panel of French farms (1987-2006) and consider the estimation of yield functions, variable input demand functions and acreage share functions.
    Keywords: Crop Production/Industries, Farm Management,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6234&r=ecm
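The Logit share form itself fits in a few lines: acreage shares are a softmax of crop-specific expected returns, hence positive and summing to one. The linear index and the scale parameter below are illustrative assumptions, not the paper's estimated specification.

```python
# Multinomial-logit acreage shares from expected per-crop returns.
import numpy as np

def logit_shares(returns, scale=1.0):
    """Shares proportional to exp(returns / scale); sum to one."""
    v = returns / scale
    e = np.exp(v - v.max())               # subtract max for stability
    return e / e.sum()

expected_returns = np.array([420.0, 380.0, 350.0])   # e.g. three crops
for scale in (10.0, 50.0):
    print(scale, logit_shares(expected_returns, scale).round(3))
# A smaller scale concentrates acreage on the most profitable crop;
# a larger one spreads acreage, mimicking implicit adjustment costs.
```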
  20. By: Isengildina-Massa, Olga; Irwin, Scott H.; Good, Darrel L.
    Abstract: This paper explores the use of quantile regression for estimation of empirical confidence limits for WASDE forecasts of corn, soybean, and wheat prices. Quantile regressions for corn, soybean, and wheat forecast errors over 1980/81 through 2006/07 were specified as a function of forecast lead time. Estimated coefficients were used to calculate forecast intervals for 2007/08. The quantile regression approach to calculating forecast intervals was evaluated based on out-of-sample performance. The accuracy of the empirical confidence intervals was not statistically different from the target level about 87% of the time prior to harvest and 91% of the time after harvest.
    Keywords: Demand and Price Analysis,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6409&r=ecm
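A sketch of the approach with statsmodels' QuantReg: regress (simulated) forecast errors on lead time at the 5th and 95th percentiles and read off an empirical 90% band. The error process is invented for illustration; the paper works with actual WASDE forecast errors.

```python
# Quantile regression of forecast errors on forecast lead time.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(14)
lead = rng.integers(1, 13, 600)                    # months to expiration
err = (0.02 * lead) * rng.standard_normal(600)     # errors widen with lead
df = pd.DataFrame({"err": err, "lead": lead})

lo = smf.quantreg("err ~ lead", df).fit(q=0.05)
hi = smf.quantreg("err ~ lead", df).fit(q=0.95)

new = pd.DataFrame({"lead": [3, 9]})
print("90% error band, 3 and 9 months out:")
print(pd.DataFrame({"lower": lo.predict(new), "upper": hi.predict(new)}))
```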
  21. By: Kuminoff, Nicolai V.
    Abstract: This paper develops a new structural estimator that uses the properties of a market equilibrium, together with information on households and their observed location choices, to recover horizontally differentiated preferences for a vector of local public goods. The estimation is consistent with equilibrium capitalization of local public goods and recognizes that job and house location choices are interrelated. By using set identification to distinguish the identifying power of restrictions on the indirect utility function from the identifying power of assumptions on the distribution of preferences, the estimator provides a new perspective on characteristics-based models of the demand for a differentiated product. The estimator is used to recover distributions of the marginal willingness-to-pay for improved air quality in Northern California's two largest population centers: the San Francisco and Sacramento metropolitan areas. The average marginal willingness-to-pay increases by up to 190% when job opportunities are included as a dimension of location choice.
    Keywords: Consumer/Household Economics, Institutional and Behavioral Economics, Research Methods/ Statistical Methods,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aare08:5989&r=ecm
  22. By: Kim, Taeyoon; Brorsen, Wade; Kenkel, Philip
    Abstract: The objective of this article is to address heteroscedasticity in the stochastic frontier cost function estimated with aggregated data, and to verify the results with a Monte Carlo study. We find that when the translog form of a stochastic frontier cost function is estimated with aggregated data, all explanatory variables can inversely affect the variation of the error terms. Our Monte Carlo study shows that heteroscedasticity is only significant in the random effect and the unexplained error term, not in the inefficiency error term. Moreover, it does not cause bias, which is quite the opposite of previous research. This is because our model is approximately defined by a first-order Taylor series around the zero-inefficiency region. Disregarding heteroscedasticity does, however, cause the average inefficiency to be overestimated when the variation of the inefficiency term dominates the other error terms.
    Keywords: Resource /Energy Economics and Policy,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6408&r=ecm
  23. By: Stephens, Emma C.; Mabaya, Edward
    Abstract: In this paper we introduce a switching error correction model (SECM) estimator that allows for the possibility that price transmission between markets might vary during periods with and without physical trade flows. Applying this new approach to semi-weekly data on tomato markets in Zimbabwe, we find that intermarket price adjustment occurs quickly and as much when there is no trade as when product flows from one market to another. This finding underscores the importance of information flow for market performance.
    Keywords: Marketing, Research Methods/ Statistical Methods, Q13, R12, C32, P42,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6538&r=ecm
  24. By: Kuethe, Todd H.; Foster, Kenneth A.; Florax, Raymond J.G.M.
    Abstract: The following paper outlines a new econometric model designed to capture both the temporal and spatial dynamics of housing prices. The paper combines existing spatial econometric techniques with a model that allows parameters to evolve over time. In addition, we provide an empirical application to the price effects of confined animal feeding operations, using a data set of residential real estate in Tippecanoe County, Indiana from 1993 through 2006.
    Keywords: Demand and Price Analysis, Land Economics/Use, Livestock Production/Industries, Research Methods/ Statistical Methods,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6306&r=ecm
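Flexible least squares for a single time-varying slope can be written down directly: minimize the sum of squared residuals plus a penalty on period-to-period coefficient changes, which yields a tridiagonal linear system in the coefficient path. The penalty weight below is an arbitrary smoothing choice, and the scalar-regressor setup is a simplification of the paper's spatial model.

```python
# Flexible least squares (FLS) for one time-varying slope:
# minimize sum_t (y_t - b_t x_t)^2 + mu * sum_t (b_t - b_{t-1})^2.
import numpy as np

def fls_path(y, x, mu):
    """Solve the first-order conditions for the coefficient path {b_t}."""
    T = len(y)
    A = np.zeros((T, T))
    rhs = x * y
    for t in range(T):
        A[t, t] = x[t] ** 2
        if t > 0:
            A[t, t] += mu
            A[t, t - 1] -= mu
        if t < T - 1:
            A[t, t] += mu
            A[t, t + 1] -= mu
    return np.linalg.solve(A, rhs)

rng = np.random.default_rng(15)
T = 200
b_true = np.concatenate([np.full(100, 1.0), np.full(100, 2.0)])  # a break
x = rng.standard_normal(T)
y = b_true * x + 0.3 * rng.standard_normal(T)

b_hat = fls_path(y, x, mu=50.0)
print("mean estimated slope, first and second half:",
      b_hat[:100].mean().round(2), b_hat[100:].mean().round(2))
```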
  25. By: ISHDORJ, Ariun; JENSEN, Helen H.
    Abstract: When using household-level data to examine consumers' demand it is common to find that consumers purchase only a subset of the available goods, setting the demand for the remaining goods to zero. Ignoring such censoring of the dependent variables in the estimation can lead to biased parameter estimates. In this paper we investigate the household's demand for six types of whole grain and non-whole grain breakfast cereals and products using a censored Almost Ideal Demand System (AIDS) and estimate the parameters of the demand system via Bayesian methods. Using 2006 ACNielsen Homescan data we find that demand for whole grain and non-whole grain ready-to-eat cereals is less responsive to changes in prices, while demand for whole-grain bars and non-whole grain hot cereals is relatively price sensitive. The elasticity estimates show that whole grain ready-to-eat cereals and whole grain bars have relatively higher expenditure elasticities than is the case for the other goods.
    Keywords: AIDS model, Bayesian econometrics, censored, cereals, whole grains, Demand and Price Analysis, C11, C34, D12,
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:ags:aaea08:6075&r=ecm

This nep-ecm issue is ©2008 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.