
New Economics Papers
on Risk Management
Issue of 2007‒12‒01
fifteen papers chosen by



  1. Measuring potential market risk By Bask, Mikael
  2. Robust Value at Risk Prediction By Loriano Mancini; Fabio Trojani
  3. NoVaS Transformations: Flexible Inference for Volatility Forecasting By Dimitris Politis; Dimitrios Thomakos
  4. Basel II and financial stability: An investigation of sensitivity and cyclicality of capital requirements based on QIS 5 By Balázs Zsámboki
  5. Price Calibration of basket default swap: Evidence from Japanese market By Fathi, Abid; Nader, Naifar
  6. Copula based simulation procedures for pricing basket Credit Derivatives By Fathi, Abid; Nader, Naifar
  7. Optimal Credit Risk Transfer, Monitored Finance and Real Investment Activity By Bhattacharya, Sudipto; Chiesa, Gabriella
  8. Joint Validation of Credit Rating PDs under Default Correlation By Ricardo Schechtman
  9. Hybrid Cat-bonds By Pauline Barrieu; Henri Loubergé
  10. Investigating value and growth: what labels hide? By Kateryna Shapovalova; Alexander Subbotin
  11. Global and local stationary modelling in finance: theory and empirical evidence. By Dominique Guégan
  12. Análise da Coerência de Medidas de Risco no Mercado Brasileiro de Ações e Desenvolvimento de uma Metodologia Híbrida para o Expected Shortfall [Analysis of the Coherence of Risk Measures in the Brazilian Stock Market and Development of a Hybrid Methodology for the Expected Shortfall] By Alan Cosme Rodrigues da Silva; Eduardo Facó Lemgruber; José Alberto Rebello Baranowski; Renato da Silva Carvalho
  13. New Framework for Measuring and Managing Macrofinancial Risk and Financial Stability By Dale F. Gray; Robert C. Merton; Zvi Bodie
  14. Bayesian Analysis of the Compound Collective Model: The Variance Premium Principle with Exponential Poisson and Gamma-Gamma Distributions By A. Hernández-Bastida; M.P. Fernández-Sánchez; E. Gómez-Deniz
  15. The Stability-Concentration Relationship in the Brazilian Banking System By Benjamin Miranda Tabak; Solange Maria Guerra; Eduardo José Araújo Lima; Eui Jung Chang

  1. By: Bask, Mikael (Bank of Finland Research)
    Abstract: The difference between market risk and potential market risk is emphasized and a measure of the latter risk is proposed. Specifically, it is argued that the spectrum of smooth Lyapunov exponents can be utilized in what we call (λ, σ²)-analysis, which is a method to monitor the aforementioned risk measures. The reason is that these exponents focus on the stability properties (λ) of the stochastic dynamic system generating asset returns, while more traditional risk measures such as value-at-risk are concerned with the distribution of returns (σ²).
    Keywords: market risk; potential market risk; smooth Lyapunov exponents; stochastic dynamic system; value-at-risk
    JEL: G11
    Date: 2007–11–13
    URL: http://d.repec.org/n?u=RePEc:hhs:bofrdp:2007_020&r=rmg
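    A minimal sketch, not taken from the paper, of how the largest Lyapunov exponent of a return-generating process might be estimated from data with a nearest-neighbour divergence method; the embedding dimension, lag and horizon are illustrative assumptions, and the paper's spectrum of smooth Lyapunov exponents for stochastic systems is more involved than this simple estimator.
      import numpy as np

      def largest_lyapunov(x, emb_dim=5, lag=1, horizon=20):
          """Rough largest-Lyapunov-exponent estimate via nearest-neighbour divergence."""
          n = len(x) - (emb_dim - 1) * lag
          emb = np.column_stack([x[i * lag:i * lag + n] for i in range(emb_dim)])  # delay embedding
          usable = n - horizon
          dist = np.linalg.norm(emb[:usable, None, :] - emb[None, :usable, :], axis=2)
          np.fill_diagonal(dist, np.inf)
          nn = dist.argmin(axis=1)                      # nearest neighbour of each embedded point
          divergence = []
          for k in range(1, horizon):
              d = np.linalg.norm(emb[np.arange(usable) + k] - emb[nn + k], axis=1)
              divergence.append(np.mean(np.log(d[d > 0])))
          # slope of mean log-divergence against time approximates the largest exponent
          return np.polyfit(np.arange(1, horizon), divergence, 1)[0]

      returns = np.random.default_rng(0).standard_normal(500) * 0.01  # placeholder return series
      print(largest_lyapunov(returns))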
  2. By: Loriano Mancini (University of Zurich); Fabio Trojani (University of St-Gallen)
    Abstract: We propose a general robust semiparametric bootstrap method to estimate conditional predictive distributions of GARCH-type models. Our approach is based on a robust estimator for the parameters in GARCH-type models and a robustified resampling method for standardized GARCH residuals, which controls the bootstrap instability due to influential observations in the tails of standardized GARCH residuals. Monte Carlo simulation shows that our method consistently provides lower VaR forecast errors, often to a large extent, and in contrast to classical methods never fails validation tests at usual significance levels. We test our approach extensively in the context of real-data applications to VaR prediction for market risk, and find that only our robust procedure passes all validation tests at usual confidence levels. Moreover, the smaller tail estimation risk of robust VaR forecasts implies VaR prediction intervals that can be nearly 20% narrower and 50% less volatile over time. This is a further desirable property of our method, which allows risky positions to be adapted to VaR limits more smoothly and thus more efficiently.
    Keywords: Backtesting, M-estimator, Extreme Value Theory, Breakdown Point.
    JEL: C14 C15 C23 C59
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp0731&r=rmg
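    A hedged sketch of the broad idea of bootstrapping one-day-ahead VaR from GARCH(1,1) standardized residuals. The fixed GARCH parameters and the winsorizing of the residual tails are illustrative stand-ins, not the authors' robust M-estimator or their EVT-based resampling scheme.
      import numpy as np

      def bootstrap_var(returns, omega, alpha, beta, level=0.01, n_boot=5000, trim=0.01, seed=0):
          """One-day-ahead VaR from a GARCH(1,1) filter with resampled standardized residuals;
          residuals beyond the trim quantiles are winsorized as a crude robustification."""
          rng = np.random.default_rng(seed)
          sigma2 = np.empty(len(returns))
          sigma2[0] = returns.var()
          for t in range(1, len(returns)):
              sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
          z = returns / np.sqrt(sigma2)                 # standardized residuals
          lo, hi = np.quantile(z, [trim, 1 - trim])
          z = np.clip(z, lo, hi)                        # damp influential tail observations
          sigma2_next = omega + alpha * returns[-1] ** 2 + beta * sigma2[-1]
          simulated = np.sqrt(sigma2_next) * rng.choice(z, size=n_boot, replace=True)
          return -np.quantile(simulated, level)         # VaR reported as a positive number

      r = np.random.default_rng(1).standard_normal(1000) * 0.01  # placeholder return series
      print(bootstrap_var(r, omega=1e-6, alpha=0.05, beta=0.90))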
  3. By: Dimitris Politis; Dimitrios Thomakos
    Abstract: In this paper we contribute several new results on the NoVaS transformation approach for volatility forecasting introduced by Politis (2003a,b, 2007). In particular: (a) we introduce an alternative target distribution (uniform); (b) we present a new method for volatility forecasting using NoVaS; (c) we show that the NoVaS methodology is applicable in situations where (global) stationarity fails, such as the cases of local stationarity and/or structural breaks; (d) we show how to apply the NoVaS ideas in the case of returns with asymmetric distribution; and finally (e) we discuss the application of NoVaS to the problem of estimating value at risk (VaR). The NoVaS methodology allows for a flexible approach to inference and has immediate applications in the context of short time series and series that exhibit local behavior (e.g. breaks, regime switching, etc.). We conduct an extensive simulation study on the predictive ability of the NoVaS approach and find that NoVaS forecasts lead to a much 'tighter' distribution of the forecasting performance measure for all data generating processes. This is especially relevant in the context of volatility predictions for risk management. We further illustrate the use of NoVaS for a number of real datasets and compare the forecasting performance of NoVaS-based volatility forecasts with realized and range-based volatility measures.
    Keywords: ARCH, GARCH, local stationarity, structural breaks, VaR, volatility.
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:uop:wpaper:0005&r=rmg
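    A small sketch of the simple NoVaS idea under stated assumptions: each return is studentized by an equal-weight average of current and lagged squared returns, and the lag length is chosen so that the transformed series has kurtosis close to the Gaussian value. The exponential-weight and uniform-target variants discussed in the paper are not reproduced here.
      import numpy as np
      from scipy.stats import kurtosis

      def simple_novas(x, max_lag=30):
          """Equal-weight NoVaS transformation with the lag length tuned for normality."""
          best_p, best_w, best_gap = None, None, np.inf
          for p in range(1, max_lag + 1):
              denom = np.array([np.mean(x[t - p:t + 1] ** 2) for t in range(p, len(x))])
              w = x[p:] / np.sqrt(denom)                # studentized (NoVaS-transformed) series
              gap = abs(kurtosis(w, fisher=False) - 3.0)
              if gap < best_gap:
                  best_p, best_w, best_gap = p, w, gap
          return best_p, best_w

      x = np.random.default_rng(2).standard_t(df=5, size=2000) * 0.01  # heavy-tailed placeholder
      p, w = simple_novas(x)
      print(p, kurtosis(w, fisher=False))               # transformed kurtosis should be near 3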
  4. By: Balázs Zsámboki (Magyar Nemzeti Bank)
    Abstract: This study aims to analyse the sensitivity of capital requirements to changes in risk parameters (PD, LGD and M) by creating a ‘model bank’ with a portfolio mirroring the average asset composition of internationally active large banks, as well as locally oriented smaller institutions participating in the QIS 5 exercise. Using historical data on corporate default rates, the dynamics of risk weights and capital requirements over a whole business cycle are also examined, with special emphasis on financial stability implications. The purpose of this paper is to contribute to a better understanding of the mechanism of Basel II and to explore the possible impacts of prudential regulation on cyclical swings in capital requirements.
    Keywords: Basel II, credit risk, capital requirement, regulation, cyclicality, financial stability.
    JEL: G21 G28 G32
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:mnb:opaper:2007/67&r=rmg
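    For reference, a sketch of the Basel II IRB risk-weight function for corporate exposures, the formula whose sensitivity to PD, LGD and M the study examines; the firm-size adjustment is omitted, and the implementation should be checked against the Basel II text rather than taken as definitive.
      from math import exp, log, sqrt
      from scipy.stats import norm

      def irb_capital_requirement(pd_, lgd, m):
          """Basel II IRB capital requirement K for corporate exposures (no size adjustment)."""
          decay = (1 - exp(-50 * pd_)) / (1 - exp(-50))
          r = 0.12 * decay + 0.24 * (1 - decay)               # asset correlation, decreasing in PD
          b = (0.11852 - 0.05478 * log(pd_)) ** 2             # maturity-adjustment coefficient
          mat_adj = (1 + (m - 2.5) * b) / (1 - 1.5 * b)
          # conditional expected loss at the 99.9th percentile of the systematic factor, less EL
          cond = norm.cdf((norm.ppf(pd_) + sqrt(r) * norm.ppf(0.999)) / sqrt(1 - r))
          return (lgd * cond - pd_ * lgd) * mat_adj

      for pd_ in (0.005, 0.01, 0.03, 0.10):                   # capital rises steeply with PD
          print(pd_, round(irb_capital_requirement(pd_, lgd=0.45, m=2.5), 4))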
  5. By: Fathi, Abid; Nader, Naifar
    Abstract: The aim of this paper is the price calibration of basket default swaps from Japanese market data. The value of these instruments depends on a number of factors, including the credit ratings of the obligors in the basket, recovery rates, default intensities, basket size and the correlation between the obligors in the basket. A fundamental part of the pricing framework is the estimation of the instantaneous default probabilities for each obligor. Because default probabilities depend on the credit quality of the obligor considered, well-calibrated credit curves are a main ingredient for constructing default times. The calibration of credit curves takes into account internal information on credit migrations and default history. We refer to the Japan Credit Rating Agency to obtain the rating transition matrix and cumulative default rates. Default is often a rare event, and many studies have shown that many distributions have fatter tails than those captured by the normal distribution. Consequently, the choice of copula and the choice of procedures for rare-event simulation govern the pricing of basket credit derivatives. Joshi and Kainth (2004) introduced an importance sampling technique for rare events that forces a predetermined number of defaults to occur on each path. We use Gaussian and Student-t copulas and study their impact on basket credit derivative prices. We also present an application of the Canonical Maximum Likelihood (CML) method for calibrating the Student-t copula to Japanese market data.
    Keywords: Basket Default Swaps; Credit Curve; Monte Carlo method; Gaussian copula; Student-t copula; Japanese market data; CML; Importance Sampling.
    JEL: G19
    Date: 2007–09–25
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:6013&r=rmg
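    A hedged sketch of the default-time construction described above: correlated uniforms are drawn from a Gaussian copula and each obligor's credit curve is inverted to obtain a default time. Flat hazard rates and a constant pairwise correlation stand in for the calibrated Japanese-market credit curves and the CML-fitted copula.
      import numpy as np
      from scipy.stats import norm

      def simulate_default_times(hazard_rates, corr, n_sims=10000, seed=0):
          """Gaussian-copula default times for obligors with flat hazard-rate credit curves."""
          rng = np.random.default_rng(seed)
          z = rng.multivariate_normal(np.zeros(len(hazard_rates)), corr, size=n_sims)
          u = norm.cdf(z)                               # copula uniforms
          # invert the survival curve P(tau > t) = exp(-h t):  tau = -log(1 - u) / h
          return -np.log(1 - u) / np.array(hazard_rates)

      hazards = [0.02, 0.03, 0.015, 0.025, 0.04]        # illustrative flat hazard rates
      corr = 0.3 * np.ones((5, 5)) + 0.7 * np.eye(5)    # illustrative correlation matrix
      taus = simulate_default_times(hazards, corr)
      print((taus.min(axis=1) < 5).mean())              # first-to-default probability within 5 years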
  6. By: Fathi, Abid; Nader, Naifar
    Abstract: This paper deals with the impact of the structure of dependency and the choice of procedures for rare-event simulation on the pricing of multi-name credit derivatives such as nth-to-default swaps and Collateralized Debt Obligations (CDOs). The correlation between defaulting names has an effect on the value of basket credit derivatives. We present a copula-based simulation procedure for pricing basket default swaps and CDOs under different structures of dependency and assess the influence of different price drivers (correlation, hazard rates and recovery rates) on modelling portfolio losses. Gaussian copulas and Monte Carlo simulation are widely used to measure default risk in basket credit derivatives. Default is often a rare event, and many studies have shown that many distributions have fatter tails than those captured by the normal distribution. Consequently, the choice of copula and the choice of procedures for rare-event simulation govern the pricing of basket credit derivatives. Alternatives to the Gaussian copula are the Clayton copula and the Student-t copula, combined with importance sampling procedures that capture the dependence structure between the underlying variables at extreme values and exploit the fact that certain values of the input random variables have more impact on the parameter being estimated than others.
    Keywords: Collateralized Debt Obligations; Basket Default Swaps; Monte Carlo method; One-factor Gaussian copula; Clayton copula; Student-t copula; importance sampling.
    JEL: G19
    Date: 2007–03
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:6014&r=rmg
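    A minimal sketch of the one-factor Gaussian copula loss simulation with a crude importance-sampling twist: the common factor is shifted towards the loss-heavy region and the estimate is reweighted by the likelihood ratio. The shift, correlation and zero-recovery assumption are illustrative, and the Clayton and Student-t variants studied in the paper are not shown.
      import numpy as np
      from scipy.stats import norm

      def basket_loss_tail(pd_, rho, n_names, attach, n_sims=50000, shift=-2.0, seed=0):
          """P(portfolio loss fraction > attach) under a one-factor Gaussian copula,
          estimated by importance sampling on the common factor."""
          rng = np.random.default_rng(seed)
          m = rng.normal(shift, 1.0, n_sims)               # shifted common factor (IS proposal)
          weights = np.exp(-shift * m + 0.5 * shift ** 2)  # likelihood ratio back to N(0, 1)
          eps = rng.standard_normal((n_sims, n_names))
          x = np.sqrt(rho) * m[:, None] + np.sqrt(1 - rho) * eps
          defaults = x < norm.ppf(pd_)                     # default indicators per name
          loss_frac = defaults.mean(axis=1)                # equal notionals, zero recovery
          return np.mean(weights * (loss_frac > attach))

      print(basket_loss_tail(pd_=0.02, rho=0.3, n_names=100, attach=0.10))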
  7. By: Bhattacharya, Sudipto; Chiesa, Gabriella
    Abstract: We examine the implications of optimal credit risk transfer (CRT) for bank-loan monitoring. In the model, monitoring improves expected returns on bank loans, but the loan-portfolio return distribution fails to satisfy the Monotone-Likelihood-Ratio Property (MLRP) because monitoring is most valuable in downturns. We find that CRT enhances loan monitoring and expands financial intermediation, in contrast to the findings of the previous literature, and the reference asset for optimal CRT is the loan portfolio, in line with the preponderance of portfolio products. An important implication of optimal CRT is that it allows maximum capital leverage. The intuition is that the lack of MLRP makes debt financing suboptimal, so the bank is rewarded for good luck rather than for monitoring, and it faces a tighter constraint on outside finance: incentive-based lending capacity, given bank capital, is smaller. Optimal CRT exploits the information conveyed by loan portfolio outcomes to shift income from lucky states to those that are more informative about the monitoring effort. Thus, monitoring incentives are optimized and incentive-based lending capacity is maximized. The role for prudential regulation of banks is examined.
    Keywords: Credit Risk Transfer; Monitoring Incentives; Prudential Regulation
    JEL: D61 D82 G21 G28
    Date: 2007–11
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:6584&r=rmg
  8. By: Ricardo Schechtman
    Abstract: The Basel Committee on Banking Supervision recognizes that one of the greatest technical challenges to the implementation of the new Basel II Accord lies in the validation of banks’ internal credit rating models (CRMs). This study investigates new proposals of statistical tests for validating the PDs (probabilities of default) of CRMs. It distinguishes between proposals aimed at checking calibration and those focused on discriminatory power. The proposed tests recognize the existence of default correlation, deal jointly with the default behaviour of all the ratings and, in contrast to the previous literature, control the error of validating incorrect CRMs. Power sensitivity analysis and strategies for power improvement are discussed, providing insights into the trade-offs and limitations pertaining to the calibration tests. An alternative goal is proposed for the tests of discriminatory power, and results of power dominance with direct practical consequences are shown for them. Finally, as the proposed tests are asymptotic, Monte Carlo simulations investigate the small-sample bias for varying parameter scenarios.
    Date: 2007–10
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:149&r=rmg
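    As background, a sketch of the simplest calibration check of this kind for a single rating grade: under the one-factor Vasicek model with default correlation, the observed default rate of a large portfolio has a known distribution, so an assigned PD can be rejected when the realized rate exceeds a critical value. This is the textbook single-grade idea, not the joint multi-rating tests developed in the paper, and the parameter values are illustrative.
      from math import sqrt
      from scipy.stats import norm

      def vasicek_critical_default_rate(pd_, rho, alpha=0.05):
          """Upper critical value for the realized default rate of a grade under the
          one-factor Vasicek model (large-portfolio approximation)."""
          return norm.cdf((norm.ppf(pd_) + sqrt(rho) * norm.ppf(1 - alpha)) / sqrt(1 - rho))

      pd_hat, rho = 0.01, 0.12                          # illustrative grade PD and asset correlation
      crit = vasicek_critical_default_rate(pd_hat, rho)
      observed = 0.028                                  # illustrative realized default rate
      print(crit, observed > crit)                      # True means calibration is rejected at 5%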
  9. By: Pauline Barrieu (London School of Economics); Henri Loubergé (University of Geneva and Swiss Finance Institute)
    Abstract: Natural catastrophes regularly attract the attention of the media and have become a source of public concern. From a financial viewpoint, natural catastrophes represent idiosyncratic risks, diversifiable at the world level. But for reasons analyzed in this paper, reinsurance markets are unable to cope with this risk completely. Insurance-linked securities, such as cat bonds, have been issued to complete the international risk transfer process, but their development has been disappointing so far. This paper argues that downside risk aversion and ambiguity aversion explain the limited success of cat bonds. Hybrid cat bonds, combining the transfer of cat risk with protection against a stock market crash, are proposed to complete the market. Using the concept of a market-modified risk measure, the paper shows that replacing simple cat bonds with hybrid cat bonds would lead to an increase in market volume.
    Keywords: Risk management, Risk transfer, Catastrophes, Risk measures, Reinsurance, Optimal design
    JEL: D81 G22
    Date: 2006–02
    URL: http://d.repec.org/n?u=RePEc:chf:rpseri:rp0727&r=rmg
  10. By: Kateryna Shapovalova (Centre d'Economie de la Sorbonne et IXIS CIB); Alexander Subbotin (Centre d'Economie de la Sorbonne et Higher School of Economics)
    Abstract: Value and growth investment styles have gained enormous popularity over the past two decades, probably owing to their practical efficiency and relative simplicity. We study the mechanics of different factors' impact on excess returns in a multivariate setting. We use a panel of stock returns and accounting data from 1979 to 2007 for companies listed on the NYSE, free of survivorship bias, for clustering, regression analysis and constructing style-based portfolios. Our findings suggest that value and growth labels often hide important heterogeneity in the underlying sources of risk. Many variables conventionally used for style definitions cannot be used jointly, because they affect returns in opposite directions. The simple truth that more variables do not necessarily make a better model nicely summarizes our results. We advocate a more flexible approach to analyzing accounting-based factors of outperformance, treating them separately before, or instead of, aggregating them.
    Keywords: Style analysis, value puzzle, pricing anomalies, equity.
    JEL: E44 G11 E32
    Date: 2007–11
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:b07066&r=rmg
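    A toy sketch of the style-portfolio construction the paper scrutinizes: stocks are sorted on a single valuation ratio and the value-minus-growth return spread is computed. The data are simulated placeholders, and the single-variable sort is exactly the kind of aggregation the authors argue can hide heterogeneity across accounting-based factors.
      import numpy as np

      def value_minus_growth(returns, book_to_market, quantile=0.3):
          """Equal-weighted return spread between high (value) and low (growth) B/M stocks."""
          lo, hi = np.quantile(book_to_market, [quantile, 1 - quantile])
          return returns[book_to_market >= hi].mean() - returns[book_to_market <= lo].mean()

      rng = np.random.default_rng(6)
      btm = rng.lognormal(mean=-0.5, sigma=0.5, size=500)                  # placeholder B/M ratios
      rets = 0.01 + 0.02 * (btm - btm.mean()) + rng.normal(scale=0.05, size=500)
      print(value_minus_growth(rets, btm))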
  11. By: Dominique Guégan (Centre d'Economie de la Sorbonne)
    Abstract: In this paper we deal with the problem of non-stationarity encountered in many data sets, arising from the existence of multiple seasonalities, jumps, volatility, distortion, aggregation, etc. We study the problems caused by these non-stationarities for the estimation of the sample autocorrelation function and give several examples of models for which spurious behavior is created in this way, including Markov-switching processes, Stopbreak models and SETAR processes. New strategies are then suggested to study these data sets locally. We first propose a test based on the k-th cumulants and, mainly, the construction of a meta-distribution based on copulas for the data set, which makes it possible to take all the non-stationarities into account. This approach suggests that we are able to do risk management for portfolios containing non-stationary assets and also to obtain the distribution function of some specific models.
    Keywords: Non-stationarity, distribution function, copula, long-memory, switching, SETAR, Stopbreak models, cumulants, estimation.
    JEL: C32 C51 G12
    Date: 2007–04
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:b07053&r=rmg
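    A small illustration, not from the paper, of the kind of spurious behavior discussed: a series whose mean switches between regimes has no genuine serial dependence within regimes, yet its sample autocorrelation function decays slowly, mimicking long memory.
      import numpy as np

      def sample_acf(x, max_lag=50):
          """Sample autocorrelation function up to max_lag."""
          x = x - x.mean()
          denom = np.dot(x, x)
          return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

      rng = np.random.default_rng(3)
      # white noise around a mean that switches regime every 250 observations
      levels = np.repeat(rng.choice([-1.0, 1.0], size=8), 250)
      y = levels + rng.standard_normal(len(levels))
      print(np.round(sample_acf(y, max_lag=10), 2))     # decays slowly despite no true memory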
  12. By: Alan Cosme Rodrigues da Silva; Eduardo Facó Lemgruber; José Alberto Rebello Baranowski; Renato da Silva Carvalho
    Abstract: This work empirically analyzes the coherence, in the sense of Artzner et al. (1997), of VaR and the Expected Shortfall in the Brazilian stock market (Bovespa), calculated with three methodologies: historical simulation, the analytical approach with EWMA volatility from RiskMetricsTM and the hybrid approach developed by Boudoukh et al. (1998). The sample includes the ten most traded stocks of Bovespa in November 2003, with prices covering the period from July 4th 1994 through October 31st 2003. For the purpose of backtesting, we use the test developed in Kupiec (1995) for the VaR, and the tail test elaborated in Berkowitz (2001) for the Expected Shortfall. The values of the Expected Shortfall calculated with the three methodologies are compared using the following criteria: the test developed in Pitman (1937), the simple mean error and the mean square error. The results show that the hybrid approach gives the Expected Shortfall closest to the loss that occurs when the VaR is violated.
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:142&r=rmg
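    A hedged sketch of the three estimators compared in the paper, in the spirit of plain historical simulation, the RiskMetrics EWMA approach and the Boudoukh et al. (1998) hybrid (exponentially weighted historical simulation). The decay factors, confidence level and simulated returns are illustrative choices rather than the paper's settings.
      import numpy as np
      from scipy.stats import norm

      def hist_var_es(returns, level=0.01):
          """Plain historical-simulation VaR and Expected Shortfall."""
          q = np.quantile(returns, level)
          return -q, -returns[returns <= q].mean()

      def ewma_var_es(returns, level=0.01, lam=0.94):
          """RiskMetrics-style VaR/ES: normal quantile scaled by EWMA volatility."""
          var_t = returns[0] ** 2
          for r in returns[1:]:
              var_t = lam * var_t + (1 - lam) * r ** 2
          sigma, z = np.sqrt(var_t), norm.ppf(level)
          return -sigma * z, sigma * norm.pdf(z) / level

      def hybrid_var_es(returns, level=0.01, lam=0.98):
          """Hybrid approach: historical simulation with exponentially decaying weights."""
          n = len(returns)
          w = lam ** np.arange(n)[::-1]                 # most recent observation weighted most
          w = w / w.sum()
          order = np.argsort(returns)                   # worst losses first
          cum = np.cumsum(w[order])
          idx = np.searchsorted(cum, level)             # first observation reaching the level
          var = -returns[order][idx]
          es = -np.average(returns[order][:idx + 1], weights=w[order][:idx + 1])
          return var, es

      r = np.random.default_rng(4).standard_t(df=4, size=1500) * 0.015  # placeholder returns
      for f in (hist_var_es, ewma_var_es, hybrid_var_es):
          print(f.__name__, tuple(round(v, 4) for v in f(r)))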
  13. By: Dale F. Gray; Robert C. Merton; Zvi Bodie
    Abstract: This paper proposes a new approach to improve the way central banks can analyze and manage the financial risks of a national economy. It is based on the modern theory and practice of contingent claims analysis (CCA), which is successfully used today at the level of individual banks by managers, investors, and regulators. The basic analytical tool is the risk-adjusted balance sheet, which shows the sensitivity of the enterprise's assets and liabilities to external "shocks." At the national level, the sectors of an economy are viewed as interconnected portfolios of assets, liabilities, and guarantees -- some explicit and others implicit. Traditional approaches have difficulty analyzing how risks can accumulate gradually and then suddenly erupt in a full-blown crisis. The CCA approach is well-suited to capturing such "non-linearities" and to quantifying the effects of asset-liability mismatches within and across institutions. Risk-adjusted CCA balance sheets facilitate simulations and stress testing to evaluate the potential impact of policies to manage systemic risk.
    JEL: E44 G0
    Date: 2007–11
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:13607&r=rmg
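    A minimal sketch of the contingent-claims building block behind the approach: a Merton-type model in which equity is a call option on (sector) assets, giving risk-adjusted values for equity and risky debt together with a distance to distress and a risk-neutral default probability. The inputs are illustrative placeholders, not a calibrated sectoral balance sheet.
      from math import exp, log, sqrt
      from scipy.stats import norm

      def merton_cca(assets, debt_barrier, sigma_a, r, horizon):
          """Merton-style CCA: equity as a call on assets, risky debt as the residual claim."""
          d1 = (log(assets / debt_barrier) + (r + 0.5 * sigma_a ** 2) * horizon) / (sigma_a * sqrt(horizon))
          d2 = d1 - sigma_a * sqrt(horizon)
          equity = assets * norm.cdf(d1) - debt_barrier * exp(-r * horizon) * norm.cdf(d2)
          return {"equity": equity, "risky_debt": assets - equity,
                  "distance_to_distress": d2, "default_prob": norm.cdf(-d2)}

      print(merton_cca(assets=120.0, debt_barrier=100.0, sigma_a=0.25, r=0.04, horizon=1.0))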
  14. By: A. Hernández-Bastida (Departamento de Métodos Cuantitativos para la Economía y la Empresa, Universidad de Granada, Spain); M.P. Fernández-Sánchez (Departamento de Métodos Cuantitativos para la Economía y la Empresa, Universidad de Granada, Spain); E. Gómez-Deniz (Department of Quantitative Methods in Economics, University of Las Palmas de G.C., Spain)
    Abstract: The distribution of the aggregate claim size is of considerable importance in insurance theory since, for example, it is needed as an input to premium calculation principles and to reserve calculation, which plays an important role in ruin theory. In this paper a Bayesian study of the collective risk model is carried out, incorporating a prior distribution for both the parameter of the claim number distribution and the parameter of the claim size distribution, and is applied to the variance premium principle. A sensitivity study is then carried out on both parameters using Bayesian global robustness. Despite the complicated form of the collective risk model, it is shown how the robustness study can be treated in an easy way. We illustrate the results obtained with numerical examples.
    Keywords: Bayesian Robustness, Contamination Class, Variance Principle.
    JEL: C11 G22
    Date: 2007–11–22
    URL: http://d.repec.org/n?u=RePEc:gra:fegper:02/07&r=rmg
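    For orientation, a sketch of the variance premium principle applied to a compound Poisson collective model with exponential claim sizes; the Bayesian prior structure and the robustness analysis of the paper are not reproduced, and the loading factor is an illustrative assumption.
      def variance_premium_compound_poisson(lam, mean_claim, loading):
          """Variance principle P = E[S] + loading * Var[S] for compound Poisson S with
          exponential claim sizes of mean mean_claim."""
          e_s = lam * mean_claim                        # E[S] = lambda * E[X]
          var_s = lam * 2 * mean_claim ** 2             # Var[S] = lambda * E[X^2]; E[X^2] = 2 * mean^2
          return e_s + loading * var_s

      print(variance_premium_compound_poisson(lam=50, mean_claim=1000.0, loading=1e-4))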
  15. By: Benjamin Miranda Tabak; Solange Maria Guerra; Eduardo José Araújo Lima; Eui Jung Chang
    Abstract: In this article the relation between non-performing loans (NPL) of the Brazilian banking system and macroeconomic factors, systemic risk and banking concentration is empirically tested. To evaluate this relation, we use a dynamic panel-data specification with fixed effects. The empirical results indicate that banking concentration has a statistically significant impact on NPL, suggesting that more concentrated banking systems may improve financial stability. These results are important for the design of banking regulation policies.
    Date: 2007–10
    URL: http://d.repec.org/n?u=RePEc:bcb:wpaper:145&r=rmg
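    A toy sketch of the fixed-effects (within) estimator underlying this kind of panel evidence, on simulated bank data; the paper's actual specification is dynamic and uses real concentration and NPL measures, so this is only the simplest static stand-in.
      import numpy as np

      def within_ols(y, x, groups):
          """Fixed-effects (within) estimator: demean y and x by group, then run OLS."""
          y_d, x_d = y.astype(float), x.astype(float)
          for g in np.unique(groups):
              idx = groups == g
              y_d[idx] -= y_d[idx].mean()
              x_d[idx] -= x_d[idx].mean(axis=0)
          beta, *_ = np.linalg.lstsq(x_d, y_d, rcond=None)
          return beta

      rng = np.random.default_rng(5)
      banks = np.repeat(np.arange(20), 40)              # 20 banks observed over 40 quarters
      concentration = rng.normal(size=banks.size)       # placeholder concentration measure
      effects = rng.normal(size=20)[banks]              # unobserved bank fixed effects
      npl = effects - 0.3 * concentration + rng.normal(scale=0.5, size=banks.size)
      print(within_ols(npl, concentration[:, None], banks))   # recovers roughly -0.3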

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.