
nep-cmp New Economics Papers
on Computational Economics
Issue of 2016‒11‒06
fifteen papers chosen by



  1. Market Liquidity and Systemic Risk in Government Bond Markets: A Network Analysis and Agent-Based Model Approach By Toshiyuki Sakiyama; Tetsuya Yamada
  2. On Credible Monetary Policies under Model Uncertainty By Ignacio Presno; Anna Orlik
  3. Potential Economic Effects of the Reduction in Agricultural and Nonagricultural Trade Barriers in the Transatlantic Trade and Investment Partnership By Caesar Cororaton; David Orden
  4. Evaluating Economy-Wide Benefit Cost Analyses By V. Kerry Smith; Min Qiang Zhao
  5. Financial Bubble Detection : A Non-Linear Method with Application to S&P 500 By Michaelides, Panayotis G.; Tsionas, Efthymios; Konstantakis, Konstantinos
  6. Putting subjective well-being to use for ex-ante policy evaluation By Xavier Jara; Erik Schokkaert
  7. The preemptive stochastic resource-constrained project scheduling problem: an efficient globally optimal solution procedure By Stefan Creemers
  8. A computational comparison of formulations for a multi-period facility location problem with modular capacity adjustments and flexible demand fulfillment By Correia, Isabel; Melo, Teresa
  9. Inefficiency and Self-Determination: Simulation-based evidence from Meiji Japan By Eric Weese; Masayoshi Hayashi; Masashi Nishikawa
  10. Numerical study of splitting methods for American option valuation By Karel in 't Hout; Radoslav Valkov
  11. Multifractal cross wavelet analysis By Zhi-Qiang Jiang; Wei-Xing Zhou; H. Eugene Stanley
  12. Option pricing in exponential Lévy models with transaction costs By Nicola Cantarutti; João Guerra; Manuel Guerra; Maria do Rosário Grossinho
  13. Optimal Fiscal Policy in a Model with Uninsurable Idiosyncratic Shocks By Marcelo Zouain Pedroni; Sebastian Dyrda
  14. Ordinal versus Cardinal Voting Rules: A Mechanism Design Approach By Semin Kim
  15. Model-independent pricing with insider information: a Skorokhod embedding approach By Beatrice Acciaio; Alexander M. G. Cox; Martin Huesmann

  1. By: Toshiyuki Sakiyama (Deputy Director, Economic and Financial Studies Division, Institute for Monetary and Economic Studies (currently Financial Markets Department), Bank of Japan (E-mail: toshiyuki.sakiyama@boj.or.jp)); Tetsuya Yamada (Director, Economic and Financial Studies Division, Institute for Monetary and Economic Studies, Bank of Japan (E-mail: tetsuya.yamada@boj.or.jp))
    Abstract: Recently, market liquidity in government bond markets has been attracting attention from market participants and central bankers, since interest rate spikes have become frequent under unconventional monetary easing. We analyze network structures in the JGB (Japanese government bond) market using daily data from the BOJ-NET (the Bank of Japan Financial Network System). To our knowledge, this is the first network analysis of a government bond market. We study how QQE (quantitative and qualitative monetary easing) has affected the JGB market structure. We also conduct event studies for the spikes in interest rates (the shock after the introduction of QQE and the so-called VaR [Value at Risk] shock in 2003). In addition, we propose an agent-based model that accounts for the findings of the above event studies, and show that not only the capital adequacy of market participants but also the network structure is important for financial market stability.
    Keywords: Market Liquidity, Government bond markets, Quantitative and Qualitative Easing, Network analysis, Systemic risk, Agent-based model
    JEL: C58 G12 G18
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:ime:imedps:16-e-13&r=cmp
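    Illustration: a minimal Python sketch, using the networkx package with invented dealer names and volumes, of the kind of structural indicators (density, centrality, trading volume per node) that a network analysis of a bond trading network produces. It is not the authors' agent-based model and does not use the confidential BOJ-NET data.
      # Toy interbank trading network: nodes are dealers, a directed edge is a
      # bond sale from seller to buyer, weighted by traded volume (hypothetical).
      import networkx as nx

      trades = [("DealerA", "DealerB", 120.0), ("DealerA", "DealerC", 80.0),
                ("DealerB", "DealerC", 60.0), ("DealerC", "DealerD", 150.0),
                ("DealerD", "DealerA", 40.0)]
      G = nx.DiGraph()
      for seller, buyer, volume in trades:
          G.add_edge(seller, buyer, weight=volume)

      # Structural indicators of the kind used to track how market structure
      # changes over time (e.g. before and after the introduction of QQE).
      print("density:", nx.density(G))
      print("degree centrality:", nx.degree_centrality(G))
      print("purchase volume per dealer:",
            {n: sum(d["weight"] for _, _, d in G.in_edges(n, data=True)) for n in G})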
  2. By: Ignacio Presno (Universidad de Montevideo); Anna Orlik (Federal Reserve Board of Governors)
    Abstract: This paper studies the design of optimal time-consistent monetary policy in an economy where the planner trusts his own model, while a representative household uses a set of alternative probability distributions governing the evolution of the exogenous state of the economy. In such environments, unlike in the original studies of time-consistent monetary policy, management of households’ expectations becomes an active channel of optimal policymaking per se, a feature that our paternalistic government seeks to exploit. We adapt recursive methods in the spirit of Abreu, Pearce, and Stacchetti (1990) as well as computational algorithms based on Judd, Yeltekin, and Conklin (2003) to fully characterize the equilibrium outcomes for a class of policy games between the government and a representative household that distrusts the model used by the government.
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:red:sed016:1280&r=cmp
  3. By: Caesar Cororaton; David Orden
    Abstract: The objective of the paper is to provide a preliminary assessment of the potential economic effects in the U.S. and EU28 of a reduction in their bilateral trade barriers. Using a global CGE model, the paper develops four trade barrier reduction scenarios and analyzes their impact on trade, production, factor prices, and welfare in the two economies over a 10-year period through 2024, compared to a baseline without reductions. The scenarios are: (i) a 90% reduction in tariffs only; (ii) 90% and 20% reductions in tariffs and NTMs, respectively, for all sectors; (iii) 90% and 20% reductions in tariffs and NTMs in non-agriculture only; and (iv) a 90% reduction in both tariffs and NTMs. Results indicate the largest percentage increases in bilateral trade occur for agriculture/food sectors when liberalization includes these sectors, but most of the gains are in non-agriculture due to its predominance in production and initial trade flows. Only the fourth scenario reverses the baseline downward trend through 2024 in U.S.-EU28 bilateral trade as a share of their global totals.
    Keywords: Transatlantic Trade and Investment Partnership (TTIP), Regional trade, United States, European Union 28, Global computable general equilibrium (CGE) model, Tariffs, Non-tariff measures (NTM)
    JEL: C68 D58 F15
    Date: 2016–04–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:74773&r=cmp
  4. By: V. Kerry Smith; Min Qiang Zhao
    Abstract: This paper examines a new strategy for evaluating whether the size of a new environmental regulation requires that benefit cost analyses consider general equilibrium effects. Size in this context refers to both the magnitude and distribution of cost increases across sectors and the benefits attributed to the rules. Rogerson’s [2008] static, general equilibrium model describing how tax policy affects time allocations between market and non-market activities is extended to include air pollution as a non-separable element in the representative household’s preferences. The paper makes three contributions to the literature. First, the calibrated parameters of the model are used to evaluate how the introduction of air quality, as a non-separable, external influence on the household’s non-market activities, affects the conventional explanation for the labor market transition in developed economies. Second, all current CGE assessments of conventional environmental policies in the U.S. and Europe ignore the feedback effects of policies for emissions and behavior. This analysis demonstrates their importance by using an amended version of the Rogerson model to compare calibrations with and without these feedback effects. Finally, a calibrated model is used to gauge the plausibility of the benefit estimates from EPA’s partial equilibrium (PE) assessment of the recent Clean Power Plan. This analysis finds that the upper limit of the PE estimates for the annual ancillary benefits of the plan (due to its effects on conventional air pollutants) is implausibly large.
    JEL: D58 D61 H41
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:22769&r=cmp
  5. By: Michaelides, Panayotis G.; Tsionas, Efthymios; Konstantakis, Konstantinos
    Abstract: The modeling of bubbles using advanced mathematical and econometric techniques is a young field of research. In this context, significant model misspecification could result from ignoring potential non-linearities. More precisely, the present paper attempts to detect and date non-linear bubble episodes. To do so, we use Neural Networks to capture the neglected non-linearities. We also provide a recursive dating procedure for bubble episodes. Using data on the S&P 500 price-dividend ratio (1871.1-2014.6) and employing Bayesian techniques, the proposed approach identifies more episodes than other bubble tests in the literature, while the common episodes are, in general, found to have a longer duration, which is evidence of an early warning mechanism (EWM) that could have important policy implications.
    Keywords: Bubbles, Non-linearities, Neural Networks, EWM, S&P500
    JEL: G01 G17 G18
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:74477&r=cmp
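    Illustration: a stylized Python/NumPy sketch of the core idea, assuming synthetic data, a plain gradient-descent fit rather than the paper's Bayesian estimation, and a naive residual threshold rather than their recursive dating procedure: fit a small one-hidden-layer network to the price-dividend relationship and flag observations with unusually large standardized residuals as candidate episodes.
      import numpy as np

      rng = np.random.default_rng(0)
      T = 300
      d = np.cumsum(rng.normal(0.01, 0.05, T)) + 3.0          # synthetic log dividends
      p = 1.2 * d + 0.3 * np.sin(d) + rng.normal(0, 0.05, T)  # synthetic log prices
      p[200:230] += np.linspace(0, 0.8, 30)                   # injected bubble-like run-up

      x = (d - d.mean()) / d.std()
      y = (p - p.mean()) / p.std()

      # One hidden layer with tanh units, trained by plain gradient descent.
      H, lr = 8, 0.05
      W1, b1 = rng.normal(0, 0.5, H), np.zeros(H)
      W2, b2 = rng.normal(0, 0.5, H), 0.0
      for _ in range(3000):
          h = np.tanh(np.outer(x, W1) + b1)          # hidden activations, shape (T, H)
          err = h @ W2 + b2 - y                      # prediction errors
          gW2, gb2 = h.T @ err / T, err.mean()
          gh = np.outer(err, W2) * (1 - h**2)        # back-propagate through tanh
          gW1, gb1 = gh.T @ x / T, gh.mean(axis=0)
          W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

      resid = y - (np.tanh(np.outer(x, W1) + b1) @ W2 + b2)
      z = (resid - resid.mean()) / resid.std()
      episodes = np.where(z > 2.0)[0]                # naive flag, not the paper's dating rule
      print("flagged observations:", episodes)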
  6. By: Xavier Jara; Erik Schokkaert
    Abstract: Most studies using microsimulation techniques have considered the effect of potential reforms only on the income distribution. However, it has become increasingly recognised, both at the academic and political level, that focusing purely on income provides a limited picture of social progress. We illustrate how ex-ante policy evaluation can be performed in terms of richer concepts of individual well-being, such as subjective life satisfaction and equivalent incomes. Our analysis makes use of EUROMOD, the EU-wide tax-benefit microsimulation model, along with 2013 EU-SILC data for Sweden, which for the first time provides information on subjective well-being. Our results show that the effect of potential reforms varies widely depending on the well-being concept used in the evaluation. We discuss the normative questions that are raised by this finding.
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:ete:ceswps:553932&r=cmp
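    Illustration: a toy Python sketch of ex-ante evaluation under two well-being concepts, with invented households, tax-benefit rules, and life-satisfaction coefficients (the paper instead uses EUROMOD with 2013 EU-SILC data for Sweden). A reform that raises an unemployment benefit is scored once by the change in disposable income and once by a simple satisfaction index that also weights non-income dimensions.
      import math

      # Invented micro-data: (gross income, unemployed?, health index in [0, 1]).
      households = [(12000, True, 0.6), (25000, False, 0.9),
                    (18000, False, 0.4), (40000, False, 0.8)]
      benefit_before, benefit_after = 2000.0, 5000.0   # reform: raise the unemployment benefit

      def disposable(gross, unemployed, benefit):
          # Flat 30 percent tax plus a benefit paid only to the unemployed (toy rules).
          return 0.70 * gross + (benefit if unemployed else 0.0)

      def life_satisfaction(income, unemployed, health):
          # Toy regression-style index: income matters, but so do joblessness and health.
          return 0.8 * math.log(income) - 1.2 * (1.0 if unemployed else 0.0) + 1.5 * health

      for gross, unemp, health in households:
          y0 = disposable(gross, unemp, benefit_before)
          y1 = disposable(gross, unemp, benefit_after)
          d_sat = life_satisfaction(y1, unemp, health) - life_satisfaction(y0, unemp, health)
          print(f"gross={gross:6d}  income gain={y1 - y0:7.1f}  satisfaction gain={d_sat:6.3f}")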
  7. By: Stefan Creemers
    Abstract: We present a globally optimal solution procedure to tackle the preemptive stochastic resource-constrained project scheduling problem (PSRCPSP). A solution to the PSRCPSP is a policy that allows the construction of a precedence- and resource-feasible schedule that minimizes the expected makespan of a project. The PSRCPSP is an extension of the stochastic resource-constrained project scheduling problem (SRCPSP) that allows activities to be interrupted. The SRCPSP and PSRCPSP both assume that activities have stochastic durations. Even though the deterministic preemptive resource-constrained project scheduling problem (PRCPSP) has received some attention in the literature, we are the first to study the PSRCPSP. We use phase-type distributions to model the stochastic activity durations, and define a new continuous-time Markov chain (CTMC) that drastically reduces memory requirements when compared to the well-known CTMC of Kulkarni and Adlakha (1986). In addition, we propose a new and efficient approach to structure the state space of the CTMC. These improvements allow us to easily outperform the current state of the art in optimal project scheduling procedures, and to solve instances of the PSPLIB J90 and J120 data sets. Last but not least, if activity durations are exponentially distributed, we show that elementary policies are globally optimal for the SRCPSP and the PSRCPSP.
    Keywords: Project management, Continuous-time Markov chain, PSRCPSP
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:ete:kbiper:553966&r=cmp
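    Illustration: a Monte Carlo Python sketch that estimates the expected makespan of a tiny project with exponentially distributed activity durations under a simple non-preemptive priority-list policy. Activity data are invented and the simulated policy stands in for, rather than reproduces, the paper's exact CTMC-based procedure.
      import heapq, random

      random.seed(1)
      mean_dur = {"A": 3.0, "B": 2.0, "C": 4.0, "D": 1.0}   # expected durations
      preds    = {"A": [], "B": [], "C": ["A"], "D": ["A", "B"]}
      demand   = {"A": 2, "B": 1, "C": 2, "D": 1}            # single renewable resource
      capacity = 3
      priority = ["A", "B", "C", "D"]

      def one_run():
          dur = {a: random.expovariate(1.0 / mean_dur[a]) for a in mean_dur}
          time, free = 0.0, capacity
          finished, active = set(), []          # active: heap of (finish time, activity)
          while len(finished) < len(dur):
              # Start, in priority order, every unstarted activity that is
              # precedence-feasible and fits within the remaining capacity.
              for a in priority:
                  if (a not in finished and all(a != b for _, b in active)
                          and set(preds[a]) <= finished and demand[a] <= free):
                      heapq.heappush(active, (time + dur[a], a))
                      free -= demand[a]
              # Advance the clock to the next completion.
              time, done = heapq.heappop(active)
              finished.add(done)
              free += demand[done]
          return time

      runs = [one_run() for _ in range(20000)]
      print("estimated expected makespan:", sum(runs) / len(runs))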
  8. By: Correia, Isabel; Melo, Teresa
    Abstract: We consider a multi-period facility location problem that takes into account changing trends in customer demands and costs. To this end, new facilities can be established at pre-specified potential locations and initially existing facilities can be closed over a planning horizon. Furthermore, facilities operate with modular capacities that can be expanded or contracted over multiple periods. A distinctive feature of our problem is that two customer segments are considered, with different sensitivity to delivery lead times. Customers in the first segment require timely demand satisfaction, whereas customers in the second segment tolerate late deliveries. A tardiness penalty cost is incurred for each unit of demand that is satisfied with delay. We propose two alternative mixed-integer linear formulations to redesign the facility network over the time horizon at minimum cost. Additional inequalities are developed to enhance the original formulations. A computational study is performed with randomly generated instances and a general-purpose solver. Useful insights are derived from analyzing the impact of several parameters, such as different demand patterns and varying values for the maximum delivery delay tolerated by individual customers, on network redesign decisions and on the overall cost.
    Keywords: facility location, multi-period, capacity expansion and contraction, delivery lateness, mixed-integer linear models
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:zbw:htwlog:11&r=cmp
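    Illustration: a compact single-period MILP in the spirit of the model, written with the PuLP package and invented data. Facilities have fixed costs and capacities, and delay-tolerant customers may leave part of their demand unserved within the lead time at a per-unit tardiness penalty; the paper's multi-period, modular-capacity structure is deliberately stripped out here.
      import pulp

      facilities = {"F1": {"fixed": 100, "cap": 60}, "F2": {"fixed": 80, "cap": 40}}
      customers  = {"C1": {"demand": 30, "delay_ok": False},
                    "C2": {"demand": 25, "delay_ok": True},
                    "C3": {"demand": 35, "delay_ok": True}}
      ship_cost  = {("F1", "C1"): 2, ("F1", "C2"): 4, ("F1", "C3"): 5,
                    ("F2", "C1"): 6, ("F2", "C2"): 3, ("F2", "C3"): 2}
      tardiness_penalty = 3  # cost per unit of demand not served within the lead time

      m = pulp.LpProblem("facility_location_toy", pulp.LpMinimize)
      y = {f: pulp.LpVariable(f"open_{f}", cat="Binary") for f in facilities}
      x = {(f, c): pulp.LpVariable(f"ship_{f}_{c}", lowBound=0)
           for f in facilities for c in customers}
      late = {c: pulp.LpVariable(f"late_{c}", lowBound=0) for c in customers}

      # Minimize fixed opening costs, shipping costs, and tardiness penalties.
      m += (pulp.lpSum(facilities[f]["fixed"] * y[f] for f in facilities)
            + pulp.lpSum(ship_cost[f, c] * x[f, c] for f in facilities for c in customers)
            + pulp.lpSum(tardiness_penalty * late[c] for c in customers))

      for c, info in customers.items():
          m += pulp.lpSum(x[f, c] for f in facilities) + late[c] == info["demand"]
          if not info["delay_ok"]:
              m += late[c] == 0          # first segment requires timely satisfaction
      for f, info in facilities.items():
          m += pulp.lpSum(x[f, c] for c in customers) <= info["cap"] * y[f]

      m.solve(pulp.PULP_CBC_CMD(msg=False))
      print("open:", {f: y[f].value() for f in facilities})
      print("late:", {c: late[c].value() for c in customers})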
  9. By: Eric Weese (Graduate School of Economics, Kobe University); Masayoshi Hayashi (University of Tokyo); Masashi Nishikawa (Aoyama Gakuin University)
    Abstract: We consider a model in which the arrangement of political boundaries involves a tradeoff between efficiencies of scale and geographic heterogeneity. If jurisdiction formation is decentralized, we show how mixed integer programming can be used to calculate core partitions via a sequence of myopic deviations. Using historical data from Japan regarding a set of centralized boundary changes, we estimate parameters using moment inequalities and find that core partitions always exist. In a counterfactual world in which there are no between-village income differences, these core partitions are extremely close to the partition that would be chosen by a utilitarian central planner. When actual cross-village income differences are used, however, sorting on income results in mergers that are both smaller and geographically bizarre.
    Date: 2016–09
    URL: http://d.repec.org/n?u=RePEc:koe:wpaper:1627&r=cmp
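    Illustration: a brute-force Python check of whether a candidate partition of villages is blocked by some coalition, under a toy utility that trades off economies of scale (a fixed cost shared per capita) against income heterogeneity. Numbers are invented, and the enumeration of coalitions stands in for the paper's mixed integer programming formulation of the deviation problem.
      from itertools import combinations

      pop    = {"V1": 100, "V2": 80, "V3": 120, "V4": 60}
      income = {"V1": 10.0, "V2": 11.0, "V3": 20.0, "V4": 21.0}
      FIXED_COST, HET_WEIGHT = 500.0, 0.5

      def utility(village, jurisdiction):
          # Per-capita share of a fixed cost, minus a penalty for being far
          # from the jurisdiction's population-weighted mean income.
          total_pop = sum(pop[v] for v in jurisdiction)
          mean_inc = sum(income[v] * pop[v] for v in jurisdiction) / total_pop
          return -FIXED_COST / total_pop - HET_WEIGHT * abs(income[village] - mean_inc)

      def blocking_coalition(partition):
          villages = [v for group in partition for v in group]
          current = {v: utility(v, g) for g in partition for v in g}
          for size in range(1, len(villages) + 1):
              for coalition in combinations(villages, size):
                  if all(utility(v, coalition) > current[v] for v in coalition):
                      return coalition      # every member strictly gains by deviating
          return None                       # no deviation: the partition is in the core

      partition = [("V1", "V2"), ("V3", "V4")]
      print("blocking coalition:", blocking_coalition(partition))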
  10. By: Karel in 't Hout; Radoslav Valkov
    Abstract: This paper deals with the numerical approximation of American-style option values governed by partial differential complementarity problems. For a variety of one- and two-asset American options, we investigate through extensive numerical experiments the temporal convergence behaviour of three modern splitting methods: the explicit payoff approach, the Ikonen-Toivanen approach and the Peaceman-Rachford method. In addition, the temporal accuracy of these splitting methods is compared to that of the penalty approach.
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1610.09622&r=cmp
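    Illustration: a bare-bones Python/NumPy version of the explicit payoff idea for a one-asset American put under Black-Scholes: advance the discretized PDE by one (explicit) time step, then project the result onto the payoff. The paper studies more refined splitting schemes (Ikonen-Toivanen, Peaceman-Rachford) and their temporal accuracy; the parameters and coarse explicit grid here are arbitrary choices for brevity.
      import numpy as np

      K, r, sigma, T = 100.0, 0.05, 0.2, 1.0
      S_max, M, N = 300.0, 300, 20000           # M space steps, N (small) time steps
      S = np.linspace(0.0, S_max, M + 1)
      dS, dt = S_max / M, T / N
      payoff = np.maximum(K - S, 0.0)
      V = payoff.copy()

      for _ in range(N):
          delta = (V[2:] - V[:-2]) / (2 * dS)                  # first derivative in S
          gamma = (V[2:] - 2 * V[1:-1] + V[:-2]) / dS**2       # second derivative in S
          V_new = V.copy()
          V_new[1:-1] = V[1:-1] + dt * (0.5 * sigma**2 * S[1:-1]**2 * gamma
                                        + r * S[1:-1] * delta - r * V[1:-1])
          V_new[0] = K                          # boundary at S = 0 (immediate exercise)
          V_new[-1] = 0.0                       # far out-of-the-money boundary
          V = np.maximum(V_new, payoff)         # explicit payoff projection

      print("American put value at S=100:", V[np.searchsorted(S, 100.0)])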
  11. By: Zhi-Qiang Jiang (ECUST, BU); Wei-Xing Zhou (ECUST); H. Eugene Stanley (BU)
    Abstract: Complex systems are composed of mutually interacting components and the output values of these components are usually long-range cross-correlated. We propose a method to characterize the joint multifractal nature of such long-range cross-correlations based on wavelet analysis, termed multifractal cross wavelet analysis (MFXWT). We assess the performance of the MFXWT method by performing extensive numerical experiments on the dual binomial measures with multifractal cross-correlations and the bivariate fractional Brownian motions (bFBMs) with monofractal cross-correlations. For binomial multifractal measures, the empirical joint multifractality of MFXWT is found to be in approximate agreement with the theoretical formula. For bFBMs, MFXWT may provide spurious multifractality because of the wide spanning range of the multifractal spectrum. We also apply the MFXWT method to stock market indexes and uncover intriguing joint multifractal nature in pairs of index returns and volatilities.
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1610.09519&r=cmp
  12. By: Nicola Cantarutti; João Guerra; Manuel Guerra; Maria do Rosário Grossinho
    Abstract: We present an approach for pricing a European call option in the presence of proportional transaction costs, when the stock price follows a general exponential Lévy process. The model is a generalization of the celebrated work of Davis, Panas and Zariphopoulou (1993), where the value of the option is found using the concept of utility indifference price. This requires solving two stochastic singular control problems in finite time, satisfying the same Hamilton-Jacobi-Bellman equation but with different terminal conditions. We numerically solve the continuous-time optimization problem using the Markov chain approximation method, and consider an underlying stock that follows an exponential Merton jump-diffusion process. This model takes into account the possibility of portfolio bankruptcy. We show numerical results for the simpler case of an infinitely rich investor, whose probability of default can be ignored. Option prices are obtained for both the writer and the buyer.
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1611.00389&r=cmp
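    Illustration: a Monte Carlo Python sketch that prices a European call when the underlying follows a Merton jump-diffusion, the exponential Lévy process used in the paper's numerical examples, but with transaction costs ignored. This frictionless price is only the natural benchmark against which utility-indifference prices with costs can be compared; parameters are invented.
      import numpy as np

      S0, K, r, sigma, T = 100.0, 100.0, 0.05, 0.2, 1.0
      lam, mu_j, sig_j = 0.5, -0.1, 0.15          # jump intensity and log-jump law
      n_paths = 200_000
      rng = np.random.default_rng(42)

      # Risk-neutral drift correction for the jump component (martingale condition).
      kappa = np.exp(mu_j + 0.5 * sig_j**2) - 1.0
      drift = (r - 0.5 * sigma**2 - lam * kappa) * T

      n_jumps = rng.poisson(lam * T, n_paths)
      jump_sum = rng.normal(mu_j * n_jumps, sig_j * np.sqrt(n_jumps))
      log_ST = (np.log(S0) + drift
                + sigma * np.sqrt(T) * rng.standard_normal(n_paths) + jump_sum)
      price = np.exp(-r * T) * np.maximum(np.exp(log_ST) - K, 0.0).mean()
      print("Merton jump-diffusion call price (no transaction costs):", round(price, 4))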
  13. By: Marcelo Zouain Pedroni (University of Amsterdam); Sebastian Dyrda (University of Toronto)
    Abstract: This paper studies optimal taxation in the standard incomplete markets model. We formulate a Ramsey problem and solve numerically for the optimal (time-varying) paths of proportional capital and labor income taxes, (possibly negative) lump-sum transfers, and government debt. The solution maximizes welfare along the transition between an initial steady state, calibrated to replicate key features of the US economy, and an endogenously determined final steady state. We find that in the optimal (utilitarian) policy: (i) capital income taxes are front-loaded, hitting the imposed upper bound of 100 percent for 33 years before decreasing to 45 percent in the long run; (ii) labor income taxes are reduced to less than half of their initial level, from 28 percent to about 13 percent in the long run; and (iii) the government accumulates assets over time, reducing the debt-to-output ratio from 63 percent to -17 percent in the long run. This leads to an average welfare gain equivalent to a permanent 4.9 percent increase in consumption. Though distortive, taxes reduce the variance of after-tax income both cross-sectionally and over time, increasing welfare through both a redistributive and an insurance motive, which we quantify.
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:red:sed016:1245&r=cmp
  14. By: Semin Kim (Yonsei University)
    Abstract: We consider the performance and incentive compatibility of voting rules in a Bayesian environment: agents have independent private values, there are at least three alternatives, and monetary transfers are prohibited. First, we show that in a neutral environment, meaning alternatives are symmetric ex-ante, essentially any ex-post Pareto efficient ordinal rule is incentive compatible. Importantly, however, we can improve upon ordinal rules. We show that we can design an incentive compatible cardinal rule which achieves higher utilitarian social welfare than any ordinal rule. Finally, we provide numerical findings about incentive compatible cardinal rules that maximize utilitarian social welfare.
    Keywords: Ordinal rule, Pareto efficiency, Incentive compatibility, Bayesian mechanism design.
    JEL: C72 D01 D02 D72 D82
    Date: 2016–11
    URL: http://d.repec.org/n?u=RePEc:yon:wpaper:2016rwp-94&r=cmp
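    Illustration: a small Python simulation of expected utilitarian welfare under i.i.d. uniform private values for three alternatives, comparing plurality (an ordinal rule in which each agent names her top choice) with a simple cardinal rule that sums values normalized to [0, 1]. Truthful reporting is assumed throughout, whereas designing cardinal rules that remain incentive compatible is exactly the paper's problem; the value distribution and rule choices are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(7)
      n_agents, n_alternatives, n_draws = 5, 3, 50_000
      welfare_plurality, welfare_cardinal = 0.0, 0.0

      for _ in range(n_draws):
          values = rng.random((n_agents, n_alternatives))      # i.i.d. uniform values
          # Plurality: each agent votes for her favorite; ties broken by lowest index.
          votes = np.bincount(values.argmax(axis=1), minlength=n_alternatives)
          welfare_plurality += values[:, votes.argmax()].sum()
          # Cardinal rule: normalize each agent's values to [0, 1] and sum the scores.
          lo = values.min(axis=1, keepdims=True)
          hi = values.max(axis=1, keepdims=True)
          scores = (values - lo) / (hi - lo)
          welfare_cardinal += values[:, scores.sum(axis=0).argmax()].sum()

      print("plurality welfare:", welfare_plurality / n_draws)
      print("normalized-score welfare:", welfare_cardinal / n_draws)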
  15. By: Beatrice Acciaio; Alexander M. G. Cox; Martin Huesmann
    Abstract: In this paper, we consider the pricing and hedging of a financial derivative for an insider trader, in a model-independent setting. In particular, we suppose that the insider wants to act in a way which is independent of any modelling assumptions, but that she observes market information in the form of the prices of vanilla call options on the asset. We also assume that both the insider's information, which takes the form of a set of impossible paths, and the payoff of the derivative are time-invariant. This setup allows us to adapt recent work of Beiglboeck, Cox and Huesmann (2016) to prove duality results and a monotonicity principle, which enables us to determine geometric properties of the optimal models. Moreover, we show that this setup is powerful, in that we are able to find analytic and numerical solutions to certain pricing and hedging problems.
    Date: 2016–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1610.09124&r=cmp

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.