
nep-rmg New Economics Papers
on Risk Management
Issue of 2022‒06‒20
twenty papers chosen by



  1. Gamma and Vega Hedging Using Deep Distributional Reinforcement Learning By Jay Cao; Jacky Chen; Soroush Farghadani; John Hull; Zissis Poulos; Zeyu Wang; Jun Yuan
  2. Forecasting Market Changes using Variational Inference By Udai Nagpal; Krishan Nagpal
  3. Hedge and Safe Haven Properties of Gold, US Treasury, Bitcoin, and Dollar/CHF against the FAANA Companies and S&P 500 By Imran Yousaf; Vasilios Plakandaras; Elie Bouri; Rangan Gupta
  4. Estimating dynamic systemic risk measures By Loïc Cantin; Christian Francq; Jean-Michel Zakoïan
  5. Volatility Sensitive Bayesian Estimation of Portfolio VaR and CVaR By Taras Bodnar; Vilhelm Niklasson; Erik Thorsén
  6. A Unified Bayesian Framework for Pricing Catastrophe Bond Derivatives By Dixon Domfeh; Arpita Chatterjee; Matthew Dixon
  7. Modelling CDS Volatility at Different Tenures: An Application for Latin-American Countries By Fredy Gamboa-Estrada; José Vicente Romero
  8. Machine learning techniques in joint default assessment By Margherita Doria; Elisa Luciano; Patrizia Semeraro
  9. So Far, So Good: Government Insurance of Financial Sector Tail Risk By Larry D. Wall
  10. Method of indirect estimation of default probability dynamics for industry-target segments according to the data of Bank of Russia By Mikhail Pomazanov
  11. Capital requirements, market structure, and heterogeneous banks By Müller, Carola
  12. The Market-Based Asset Price Probability By Olkhov, Victor
  13. Excess Out-of-Sample Risk and Fleeting Modes By Jean-Philippe Bouchaud; Iacopo Mastromatteo; Marc Potters; Konstantin Tikhonov
  14. A re-examination of the U.S. insurance market’s capacity to pay catastrophe losses By Dionne, Georges; Desjardins, Denise
  15. A Volatility Estimator of Stock Market Indices Based on the Intrinsic Entropy Model By Claudiu Vinte; Marcel Ausloos; Titus Felix Furtuna
  16. Accounting for Risk in a Linearized Solution: How to Approximate the Risky Steady State and Around It By Pierlauro Lopez; J. David López-Salido; Francisco Vazquez-Grande
  17. Comparative risk and ambiguity aversion: an experimental approach By Takashi Hayashi; Ryoko Wada
  18. Pricing Path-dependent Options under Stochastic Volatility via Mellin Transform By Jiling Cao; Jeong-Hoon Kim; Xi Li; Wenjun Zhang
  19. Randomized geometric tools for anomaly detection in stock markets By Cyril Bachelard; Apostolos Chalkis; Vissarion Fisikopoulos; Elias Tsigaridas
  20. Machine Learning Methods: Potential for Deposit Insurance By Ryan Defina

  1. By: Jay Cao; Jacky Chen; Soroush Farghadani; John Hull; Zissis Poulos; Zeyu Wang; Jun Yuan
    Abstract: We use deep distributional reinforcement learning (RL) to develop hedging strategies for a trader responsible for derivatives dependent on a particular underlying asset. The transaction costs associated with trading the underlying asset are usually quite small. Traders therefore tend to carry out delta hedging daily, or even more frequently, to ensure that the portfolio is almost completely insensitive to small movements in the asset's price. Hedging the portfolio's exposure to large asset price movements and volatility changes (gamma and vega hedging) is more expensive because this requires trades in derivatives, for which transaction costs are quite large. Our analysis takes account of these transaction cost differences. It shows how RL can be used to develop a strategy for using options to manage gamma and vega risk with three different objective functions. These objective functions involve a mean-variance trade-off, value at risk, and conditional value at risk. We illustrate how the optimal hedging strategy depends on the asset price process, the trader's objective function, the level of transaction costs when options are traded, and the maturity of the options used for hedging.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.05614&r=
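The three objective functions named in the abstract above (a mean-variance trade-off, value at risk, conditional value at risk) can be illustrated on a simulated hedging P&L sample. A minimal numpy sketch, not the authors' reinforcement-learning implementation; the P&L distribution, risk-aversion weight, and confidence level are hypothetical placeholders:

```python
import numpy as np

def hedging_objectives(pnl, alpha=0.95, risk_aversion=1.5):
    """Score a sample of hedging P&L under the three objective types named in
    the abstract: a mean-variance trade-off, value at risk (VaR), and
    conditional value at risk (CVaR). Losses are the negative of P&L."""
    losses = -np.asarray(pnl, dtype=float)
    var = np.quantile(losses, alpha)              # empirical VaR at level alpha
    cvar = losses[losses >= var].mean()           # mean loss beyond the VaR
    mean_variance = losses.mean() + risk_aversion * losses.std()
    return {"mean_variance": mean_variance, "VaR": var, "CVaR": cvar}

# Hypothetical P&L of a gamma/vega-hedged book over simulated paths.
rng = np.random.default_rng(0)
pnl = rng.normal(-0.02, 1.0, 100_000) - 0.5 * rng.gamma(1.0, 0.3, 100_000)
print(hedging_objectives(pnl))
```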
  2. By: Udai Nagpal; Krishan Nagpal
    Abstract: Though various approaches have been considered, forecasting near-term market changes of equities and similar market data remains quite difficult. In this paper we introduce an approach to forecast near-term market changes for equity indices as well as portfolios using variational inference (VI). VI is a machine learning approach which uses optimization techniques to estimate complex probability densities. In the proposed approach, clusters of explanatory variables are identified and market changes are forecast based on cluster-specific linear regression. Apart from the expected value of changes, the proposed approach can also be used to obtain the distribution of possible outcomes, which can be used to estimate confidence levels of forecasts and risk measures such as VaR (Value at Risk) for the portfolio. Another advantage of the proposed approach is the clear model interpretation, as clusters of explanatory variables (or market regimes) are identified for which the future changes follow similar relationships. Knowledge about such clusters can provide useful insights about portfolio performance and identify the relative importance of variables in different market regimes. Illustrative examples of equity and bond indices are considered to demonstrate forecasts of the proposed approach during Covid-related volatility in early 2020 and subsequent benign market conditions. For the portfolios considered, it is shown that the proposed approach provides useful forecasts in both normal and volatile markets even with only a few explanatory variables. Additionally the predicted estimate and distribution adapt quickly to changing market conditions and thus may also be useful in obtaining better real-time estimates of risk measures such as VaR compared to traditional approaches.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.00605&r=
  3. By: Imran Yousaf (School of Management, Air University, Islamabad, Pakistan); Vasilios Plakandaras (Department of Economics, Democritus University of Thrace, Komotini, 69100, Greece); Elie Bouri (School of Business, Lebanese American University, Lebanon); Rangan Gupta (Department of Economics, University of Pretoria, Private Bag X20, Hatfield 0028, South Africa)
    Abstract: The sudden market crash of 20 February 2020 arising from the COVID-19 pandemic has accelerated the digitalization phenomenon and revived interest in risk mitigation during stress periods. In this paper, we examine the hedging, diversifying, and safe haven properties of gold, U.S. treasury bonds, Bitcoin, and Dollar/CHF for the FAANA (Facebook, Apple, Amazon, Netflix, and Alphabet) stocks and the S&P 500 index. FAANA exhibited positive returns with remarkable resilience throughout the pandemic period, suggesting a change in their investing character from risky to riskless assets. In our approach we examine both an extended sample period and a focused evaluation of the heightened uncertainty during the recent pandemic period. Furthermore, we estimate the optimal weights, hedge ratios, and hedging effectiveness for the pairs of stock and alternative assets (gold, US treasury, Bitcoin, and Dollar/CHF) during the full sample period and the COVID-19 pandemic period. Our empirical findings suggest that FAANA, once thought of as risky high-growth tech stocks, have matured and become a safe blanket during the latest turbulent period.
    Keywords: Safe haven assets, Hedging, Diversification, FAANA stocks, COVID-19 outbreak
    JEL: C32 G15
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:202227&r=
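The optimal weights, hedge ratios, and hedging effectiveness reported in the entry above are standard measures. A minimal unconditional (sample-moment) sketch is given below, whereas the paper works with conditional covariances; function and variable names are illustrative:

```python
import numpy as np

def hedge_metrics(stock_ret, hedge_ret):
    """Unconditional hedge ratio, hedging effectiveness, and minimum-variance
    portfolio weight for a stock hedged with an alternative asset (gold,
    Treasuries, Bitcoin, Dollar/CHF). A sample-moment stand-in for the
    conditional versions estimated in the paper."""
    s, h = np.asarray(stock_ret, float), np.asarray(hedge_ret, float)
    cov = np.cov(s, h)
    h_ss, h_hh, h_sh = cov[0, 0], cov[1, 1], cov[0, 1]
    beta = h_sh / h_hh                                     # hedge ratio
    he = 1.0 - np.var(s - beta * h, ddof=1) / h_ss         # hedging effectiveness
    w_stock = np.clip((h_hh - h_sh) / (h_ss - 2 * h_sh + h_hh), 0.0, 1.0)
    return {"hedge_ratio": beta, "effectiveness": he, "weight_stock": w_stock}
```

Replacing the sample covariances with conditional (e.g. GARCH-based) covariances gives the time-varying versions typically reported in this literature.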
  4. By: Loïc Cantin (CREST, 5 Avenue Henri Le Chatelier, 91120 Palaiseau, France); Christian Francq (CREST, 5 Avenue Henri Le Chatelier, 91120 Palaiseau, France); Jean-Michel Zakoïan (CREST, 5 Avenue Henri Le Chatelier, 91120 Palaiseau, France)
    Abstract: We propose a two-step semi-parametric estimation approach for dynamic Conditional VaR (CoVaR), from which other important systemic risk measures such as the Delta-CoVaR can be derived. The CoVaR can be used to define reserves for a given financial entity that limit excess losses when the system is in distress. We assume that all financial returns in the system follow semi-parametric GARCH-type models. Our estimation method relies on the fact that the dynamic CoVaR is the product of the volatility of the financial entity’s return and a conditional quantile term involving the innovations of the different returns. We show that the latter quantity can be easily estimated from residuals of the GARCH-type models estimated by Quasi-Maximum Likelihood (QML). The study of the asymptotic behaviour of the corresponding estimator and the derivation of asymptotic confidence intervals for the dynamic CoVaR are the main purposes of the paper. Our theoretical results are illustrated via Monte-Carlo experiments and real financial time series.
    Keywords: conditional CoVaR and Delta-CoVaR, empirical distribution of bivariate residuals, model-free estimation risk, multivariate risks.
    Date: 2022–01–24
    URL: http://d.repec.org/n?u=RePEc:crs:wpaper:2022-11&r=
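The second step of the estimator described above reduces to multiplying a conditional volatility by an empirical conditional quantile of standardized residuals. A hedged numpy sketch of that step, assuming step one (a GARCH-type fit producing volatilities and standardized residuals) has already been carried out; the variable names are illustrative:

```python
import numpy as np

def dynamic_covar(sigma_y, eta_x, eta_y, alpha=0.05, beta=0.05):
    """Dynamic CoVaR as the product of the entity's conditional volatility
    sigma_y and an empirical conditional quantile of its standardized
    innovations eta_y, taken over the dates when the conditioning series'
    innovations eta_x are in distress (below their alpha-quantile).
    All three inputs come from a prior GARCH-type fit (step one)."""
    eta_x, eta_y = np.asarray(eta_x, float), np.asarray(eta_y, float)
    distress = eta_x <= np.quantile(eta_x, alpha)      # conditioning (distress) dates
    q = np.quantile(eta_y[distress], beta)             # conditional quantile of innovations
    return np.asarray(sigma_y, float) * q              # CoVaR path over time
```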
  5. By: Taras Bodnar; Vilhelm Niklasson; Erik Thorsén
    Abstract: In this paper, a new way to integrate volatility information for estimating value at risk (VaR) and conditional value at risk (CVaR) of a portfolio is suggested. The new method is developed from the perspective of Bayesian statistics and it is based on the idea of volatility clustering. By specifying the hyperparameters in a conjugate prior based on two different rolling window sizes, it is possible to quickly adapt to changes in volatility and automatically specify the degree of certainty in the prior. This constitutes an advantage in comparison to existing Bayesian methods that are less sensitive to such changes in volatilities and also usually lack standardized ways of expressing the degree of belief. We illustrate our new approach using both simulated and empirical data. Compared to some other well known homoscedastic and heteroscedastic models, the new method provides a good alternative for risk estimation, especially during turbulent periods where it can quickly adapt to changing market conditions.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.01444&r=
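A toy version of the idea in the entry above: a conjugate (inverse-gamma) prior on the return variance whose hyperparameters are set from a long rolling window, updated with a short recent window, with VaR and CVaR read off the Student-t posterior predictive. The window sizes and the zero-mean assumption are illustrative choices, not the authors' specification:

```python
import numpy as np
from scipy import stats

def bayes_var_cvar(returns, long_win=500, short_win=50, alpha=0.05):
    """Conjugate-Bayes VaR/CVaR sketch: an inverse-gamma prior on the variance
    is set from a long window and updated with a short recent window; the
    Student-t posterior predictive yields VaR and CVaR (positive losses)."""
    r = np.asarray(returns, float)
    long_r, short_r = r[-long_win:], r[-short_win:]
    a0 = len(long_r) / 2.0                      # prior strength from the long window
    b0 = a0 * np.var(long_r)                    # prior scale matching long-run variance
    a_n = a0 + len(short_r) / 2.0               # conjugate update with recent data
    b_n = b0 + 0.5 * np.sum(short_r ** 2)
    df, scale = 2.0 * a_n, np.sqrt(b_n / a_n)   # Student-t posterior predictive
    q = stats.t.ppf(alpha, df)
    var = -scale * q
    cvar = scale * stats.t.pdf(q, df) / alpha * (df + q ** 2) / (df - 1.0)
    return var, cvar
```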
  6. By: Dixon Domfeh; Arpita Chatterjee; Matthew Dixon
    Abstract: Catastrophe (CAT) bond markets are incomplete and hence carry uncertainty in instrument pricing. As such, various pricing approaches have been proposed, but none treat the uncertainty in catastrophe occurrences and interest rates in a sufficiently flexible and statistically reliable way within a unifying asset pricing framework. Consequently, little is known empirically about the expected risk-premia of CAT bonds. The primary contribution of this paper is to present a unified Bayesian CAT bond pricing framework based on uncertainty quantification of catastrophes and interest rates. Our framework allows for complex beliefs about catastrophe risks to capture the distinct and common patterns in catastrophe occurrences, and when combined with stochastic interest rates, yields a unified asset pricing approach with informative expected risk premia. Specifically, using a modified collective risk model -- Dirichlet Prior-Hierarchical Bayesian Collective Risk Model (DP-HBCRM) framework -- we model catastrophe risk via a model-based clustering approach. Interest rate risk is modeled as a CIR process under the Bayesian approach. As a consequence of casting CAT pricing models into our framework, we evaluate the price and expected risk premia of various CAT bond contracts corresponding to clustering of catastrophe risk profiles. Numerical experiments show that these clusters reveal how CAT bond prices and expected risk premia relate to claim frequency and loss severity.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.04520&r=
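The interest-rate leg described in the entry above is a CIR process. A minimal full-truncation Euler simulation sketch; the parameter values are placeholders, not estimates from the paper:

```python
import numpy as np

def simulate_cir(r0, kappa, theta, sigma, T=1.0, steps=252, n_paths=10_000, seed=0):
    """Simulate the CIR short rate dr = kappa*(theta - r)dt + sigma*sqrt(r)dW
    with a full-truncation Euler scheme (the rate is floored at zero inside
    the drift and diffusion terms)."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    paths = np.empty((steps + 1, n_paths))
    paths[0] = r0
    for t in range(1, steps + 1):
        r_pos = np.maximum(paths[t - 1], 0.0)
        z = rng.standard_normal(n_paths)
        paths[t] = paths[t - 1] + kappa * (theta - r_pos) * dt \
                   + sigma * np.sqrt(r_pos * dt) * z
    return paths

# Average discount factor over the horizon, e.g. for a zero-coupon bond leg.
paths = simulate_cir(r0=0.03, kappa=0.5, theta=0.04, sigma=0.10)
discount = np.exp(-paths[:-1].mean(axis=0) * 1.0).mean()
```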
  7. By: Fredy Gamboa-Estrada; José Vicente Romero
    Abstract: Assessing the dynamics of risk premium measures and their relationship with macroeconomic fundamentals is important for both macroeconomic policymakers and market practitioners. This paper analyzes the main determinants of CDS in Latin America at different tenures, focusing on their volatility. Using a component GARCH model, we decompose volatility into permanent and transitory components. We find that the permanent component of CDS volatility in all tenors was higher and more persistent in the global financial crisis than during the recent COVID-19 shock.
    Keywords: Credit default swaps (CDS), CDS in Latin-American countries, sovereign risk, volatility, crisis, component GARCH models
    JEL: C22 C58 G01 G15
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:bdr:borrec:1199&r=
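The permanent/transitory split in the entry above is the component GARCH decomposition. A sketch of the Engle-Lee component GARCH filter applied to demeaned CDS spread changes; the parameters would come from estimation, so they are left as inputs here:

```python
import numpy as np

def component_garch_filter(eps, omega, rho, phi, alpha, beta):
    """Engle-Lee component GARCH: the conditional variance sigma2_t is split
    into a permanent component q_t and a transitory part sigma2_t - q_t.
    eps are demeaned CDS spread changes; parameters are taken as given."""
    eps = np.asarray(eps, dtype=float)
    n = len(eps)
    sigma2 = np.empty(n)
    q = np.empty(n)
    sigma2[0] = q[0] = np.var(eps)     # initialize at the sample variance
    for t in range(1, n):
        q[t] = omega + rho * (q[t - 1] - omega) + phi * (eps[t - 1] ** 2 - sigma2[t - 1])
        sigma2[t] = q[t] + alpha * (eps[t - 1] ** 2 - q[t - 1]) \
                         + beta * (sigma2[t - 1] - q[t - 1])
    return sigma2, q                    # total and permanent variance paths
```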
  8. By: Margherita Doria; Elisa Luciano; Patrizia Semeraro
    Abstract: This paper studies the consequences of capturing non-linear dependence among the covariates that drive the default of different obligors and the overall riskiness of their credit portfolio. Joint defaults are modeled, without loss of generality, with the classical Bernoulli mixture model. In an application to a credit card dataset, we show that, even when Machine Learning techniques perform only slightly better than Logistic Regression in classifying individual defaults as a function of the covariates, they do outperform it at the portfolio level. This happens because they capture linear and non-linear dependence among the covariates, whereas Logistic Regression only captures linear dependence. The ability of Machine Learning methods to capture non-linear dependence among the covariates produces higher default correlation compared with Logistic Regression. As a consequence, on our data, Logistic Regression underestimates the riskiness of the credit portfolio.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.01524&r=
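To see the portfolio-level effect discussed in the entry above, one can compare tail quantiles of simulated portfolio losses under PDs fitted by different classifiers. A toy Bernoulli-mixture sketch using scikit-learn; the resampling scheme and the placeholder data (X, y) are illustrative, not the authors' procedure:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression

def portfolio_tail(model, X, y, m=1_000, n_sims=2_000, q=0.99, seed=0):
    """Fit a PD model, then simulate losses of a portfolio of m obligors:
    covariate vectors are drawn jointly (preserving their dependence) and,
    conditional on them, defaults are independent Bernoulli draws with the
    fitted probabilities, i.e. a Bernoulli mixture."""
    rng = np.random.default_rng(seed)
    pd_hat = model.fit(X, y).predict_proba(X)[:, 1]
    losses = np.empty(n_sims)
    for s in range(n_sims):
        idx = rng.integers(0, len(pd_hat), size=m)   # joint draw of obligor covariates
        losses[s] = rng.binomial(1, pd_hat[idx]).sum()
    return np.quantile(losses, q)

# Usage with a credit dataset (X: covariates, y: default indicator):
# print(portfolio_tail(LogisticRegression(max_iter=1000), X, y))
# print(portfolio_tail(GradientBoostingClassifier(), X, y))
```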
  9. By: Larry D. Wall
    Abstract: The US government has intervened to provide extraordinary support 16 times from 1970 to 2020 with the goal of preventing or mitigating the cost of financial instability to the financial sector and the real economy. This article discusses the motivation for such support, reviewing the instances where support was provided, along with one case where it was expected but not provided. The article then discusses the moral hazard and fiscal risks posed by the government's insurance of the tail risk along with ways to reduce the government's risk exposure.
    Keywords: financial stability; FDIC; Federal Reserve; Treasury; bailout; financial history
    JEL: F33 F36 G18 G21 G23 G28 G32 H12 H6 N22
    Date: 2021–11–08
    URL: http://d.repec.org/n?u=RePEc:fip:a00001:94154&r=
  10. By: Mikhail Pomazanov
    Abstract: Direct calculation of default rates by industry and target corporate segment is not possible given the lack of statistical data. This paper proposes a model for filtering the dynamics of the probability of default of corporate and other borrowers based on indirect data on the dynamics of overdue debt supplied by the Bank of Russia. The model is based on the balance equation linking total and overdue debt; the missing links of the corresponding time series are reconstructed using the Hodrick-Prescott filter. In retail lending segments (mortgage and consumer lending), default statistics are available from credit bureaus, and the proposed method is validated against them. Over the limited historical period available, validation shows that the results are trustworthy. The resulting default probability series are exogenous variables for macroeconomic modelling of sectoral credit risks.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.05984&r=
  11. By: Müller, Carola
    Abstract: Bank regulators interfere with the efficient allocation of resources for the sake of financial stability. Based on this trade-off, I compare how different capital requirements affect default probabilities and the allocation of market shares across heterogeneous banks. In the model, banks' productivity determines their optimal strategy in oligopolistic markets. Higher productivity gives banks higher profit margins that lower their default risk. Hence, capital requirements indirectly aiming at high-productivity banks are less effective. They also bear a distortionary cost: Because incumbents increase interest rates, new entrants with low productivity are attracted and thus average productivity in the banking market decreases.
    Keywords: bank competition, bank regulation, Basel III, capital requirements, heterogeneous banks, leverage ratio
    JEL: G11 G21 G28
    Date: 2022
    URL: http://d.repec.org/n?u=RePEc:zbw:iwhdps:152022&r=
  12. By: Olkhov, Victor
    Abstract: This paper introduces the market-based asset price probability over a time averaging interval Δ. We replace the problem of guessing the “correct” form of the asset price probability with a description of the price probability as a function of the statistical moments of market trade value and volume during Δ. We define the n-th price statistical moment as the ratio of the n-th statistical moment of the trade value to the n-th statistical moment of the trade volume. This definition assumes no correlation between the time series of the n-th powers of the price and the trade volume during Δ, but it does not imply statistical independence between trade volume and price. The set of n-th price statistical moments defines the Taylor series of the price characteristic function. Approximations of the price characteristic function that reproduce only the first m price statistical moments generate approximations of the market-based price probability. This approach unifies the probability description of market-based asset prices, price indices, returns, inflation, and their volatilities. The market-based price probability approach affects asset pricing models, uncovers hidden difficulties and usage bounds of the widespread risk hedging tool Value-at-Risk, makes it possible to determine price autocorrelations, and turns classical option pricing from a one-dimensional into a two-dimensional problem. The market-based approach does not simplify the price probability puzzle, but it establishes direct economic ties between asset pricing, market randomness, and economic theory. Describing market-based price and return volatility, skewness, and kurtosis requires the development of economic theories that model relations between second-, third-, and fourth-order macroeconomic variables. Developing these theories will take considerable effort and many years.
    Keywords: asset price; price probability; returns; inflation; market trades
    JEL: C01 C58 E31 E37 G12 G17
    Date: 2022–05–15
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:113096&r=
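The moment definition in the entry above is straightforward to state in code: the n-th market-based price moment over the averaging interval is the ratio of the n-th moment of trade value to the n-th moment of trade volume. A minimal numpy sketch; the variance line is simply the usual second-minus-squared-first-moment combination of those price moments:

```python
import numpy as np

def market_based_price_moments(trade_values, trade_volumes, n_max=4):
    """n-th market-based price moment over the averaging interval:
    p(n) = E[C^n] / E[U^n], with C the trade value and U the trade volume."""
    C = np.asarray(trade_values, float)
    U = np.asarray(trade_volumes, float)
    p = {n: np.mean(C ** n) / np.mean(U ** n) for n in range(1, n_max + 1)}
    price_variance = p[2] - p[1] ** 2   # variance implied by the first two moments
    return p, price_variance
```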
  13. By: Jean-Philippe Bouchaud; Iacopo Mastromatteo; Marc Potters; Konstantin Tikhonov
    Abstract: Using Random Matrix Theory, we propose a universal and versatile tool to reveal the existence of "fleeting modes", i.e. portfolios that carry statistically significant excess risk, signalling ex-post a change in the correlation structure in the underlying asset space. Our proposed test is furthermore independent of the "true" (but unknown) underlying correlation structure. We show empirically that such fleeting modes exist both in futures markets and in equity markets. We propose a metric to quantify the alignment between known factors and fleeting modes and identify momentum as a source of excess risk in the equity space.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.01012&r=
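A simplified diagnostic in the spirit of the entry above: compare the out-of-sample variance of the in-sample eigenportfolios with their in-sample variance, so that modes with large excess ratios are candidate fleeting modes. This is not the authors' Random Matrix Theory test statistic, and it assumes more in-sample observations than assets:

```python
import numpy as np

def excess_oos_risk(returns_in, returns_out):
    """Out-of-sample over in-sample variance ratio of the eigenportfolios of
    the in-sample correlation matrix. Ratios far above one flag portfolios
    carrying excess out-of-sample risk (candidate 'fleeting modes').
    Requires more in-sample rows (dates) than columns (assets)."""
    R_in = np.asarray(returns_in, float)
    R_out = np.asarray(returns_out, float)
    scale = R_in.std(axis=0, ddof=1)
    X_in, X_out = R_in / scale, R_out / scale        # use in-sample scales throughout
    eigval, eigvec = np.linalg.eigh(np.cov(X_in, rowvar=False))
    var_out = np.var(X_out @ eigvec, axis=0, ddof=1)
    return var_out / eigval                          # excess risk ratio per mode
```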
  14. By: Dionne, Georges (HEC Montreal, Canada Research Chair in Risk Management); Desjardins, Denise (HEC Montreal, Canada Research Chair in Risk Management)
    Abstract: Cummins, Doherty, and Lo (2002) present a theoretical and empirical analysis of the capacity of the property liability insurance industry in the U.S. to finance catastrophic losses. In their theoretical analysis, they show that a sufficient condition for capacity maximization is for all insurers to hold a net of reinsurance underwriting portfolio that is perfectly correlated with aggregate industry losses. Estimating capacity from insurers’ financial statement data, they find that the U.S. insurance industry could adequately fund a $100 billion event in 1997. As a matter of comparison, Hurricane Katrina in 2005 cost the insurance industry $40 to $55 billion (2005 dollars). Our main objective is to update the study of Cummins et al (2002) with new data available up to the end of 2020. We verify how the insurance market’s capacity has evolved over recent years. We show that the U.S. insurance industry’s capacity to pay catastrophe losses is higher in 2020 than it was in 1997. Insurers could pay 98% of a $200 billion loss in 2020 in comparison to 81% in 1997.
    Keywords: Catastrophe loss; US insurance industry; industry capacity; reinsurance; climate finance; climate risk
    JEL: D53 D81 G22 G52 Q54 Q57
    Date: 2022–05–11
    URL: http://d.repec.org/n?u=RePEc:ris:crcrmw:2022_002&r=
  15. By: Claudiu Vinte; Marcel Ausloos; Titus Felix Furtuna
    Abstract: Grasping the historical volatility of stock market indices and accurately estimating it are two of the major focuses of those involved in the financial securities industry and derivative instruments pricing. This paper presents the results of employing the intrinsic entropy model as a substitute for estimating the volatility of stock market indices. Diverging from the widely used volatility models that take into account only the elements related to the traded prices, namely the open, high, low, and close prices of a trading day (OHLC), the intrinsic entropy model takes into account the traded volumes during the considered time frame as well. We adjust the intraday intrinsic entropy model that we introduced earlier for exchange-traded securities in order to connect daily OHLC prices with the ratio of the corresponding daily volume to the overall volume traded in the considered period. The intrinsic entropy model conceptualizes this ratio as entropic probability or market credence assigned to the corresponding price level. The intrinsic entropy is computed using historical daily data for traded market indices (S&P 500, Dow 30, NYSE Composite, NASDAQ Composite, Nikkei 225, and Hang Seng Index). We compare the results produced by the intrinsic entropy model with the volatility estimates obtained for the same data sets using widely employed industry volatility estimators. The intrinsic entropy model proves to consistently deliver reliable estimates for various time frames while showing peculiarly high values for the coefficient of variation, with the estimates falling in a significantly lower interval range compared with those provided by the other advanced volatility estimators.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.01370&r=
  16. By: Pierlauro Lopez; J. David López-Salido; Francisco Vazquez-Grande
    Abstract: We propose a novel approximation of the risky steady state and construct first-order perturbations around it for a general class of dynamic equilibrium models with time-varying and non-Gaussian risk. We offer analytical formulas and conditions for their local existence and uniqueness. We apply this approximation technique to models featuring Campbell-Cochrane habits, recursive preferences, and time-varying disaster risk, and show how the proposed approximation represents the implications of the model similarly to global solution methods. We show that our approximation of the risky steady state cannot be generically replicated by higher-order perturbations around the deterministic steady state, which cannot account well for the effects of risk in our applications even up to third order. Finally, we argue that our perturbation can be viewed as a generalized version of the heuristic loglinear-lognormal approximations commonly used in the macro-finance literature.
    Keywords: Perturbation methods; Risky steady state; Macroeconomic uncertainty; Solving dynamic equilibrium models; Time-varying risk premia
    JEL: C63 G12 E32 E44
    Date: 2022–05–11
    URL: http://d.repec.org/n?u=RePEc:fip:fedcwq:94186&r=
  17. By: Takashi Hayashi (University of Glasgow); Ryoko Wada (Keiai University)
    Abstract: This paper experimentally studies comparative properties of risk aversion and ambiguity aversion in a way that allows for the role of heterogeneity. We examine the correlation between the degree of risk aversion and the degree of ambiguity aversion, how the latter changes across geometric properties of the objective sets of possible probability distributions, and how ambiguous information is generated.
    Keywords: Ambiguity aversion, risk aversion, comparative definitions, choice experiments
    JEL: C91 D81
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:1079&r=
  18. By: Jiling Cao; Jeong-Hoon Kim; Xi Li; Wenjun Zhang
    Abstract: In this paper, we derive closed-form formulas of first-order approximation for down-and-out barrier and floating strike lookback put option prices under a stochastic volatility model, by using an asymptotic approach. To find the explicit closed-form formulas for the zero-order term and the first-order correction term, we use the Mellin transform. We also conduct a sensitivity analysis on these formulas and compare the option prices calculated by them with those generated by Monte-Carlo simulation.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.00573&r=
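For reference, the Mellin transform used in the entry above and its inversion are given below (definition only; the closed-form option formulas themselves are derived in the paper):

```latex
\mathcal{M}\{f\}(s) \;=\; \int_{0}^{\infty} f(x)\, x^{s-1}\, dx ,
\qquad
f(x) \;=\; \frac{1}{2\pi i} \int_{c - i\infty}^{c + i\infty} \mathcal{M}\{f\}(s)\, x^{-s}\, ds .
```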
  19. By: Cyril Bachelard; Apostolos Chalkis; Vissarion Fisikopoulos; Elias Tsigaridas
    Abstract: We propose novel randomized geometric tools to detect low-volatility anomalies in stock markets; a principal problem in financial economics. Our modeling of the (detection) problem results in sampling and estimating the (relative) volume of geodesically non-convex and non-connected spherical patches that arise by intersecting a non-standard simplex with a sphere. To sample, we introduce two novel Markov Chain Monte Carlo (MCMC) algorithms that exploit the geometry of the problem and employ state-of-the-art continuous geometric random walks (such as Billiard walk and Hit-and-Run) adapted on spherical patches. To our knowledge, this is the first geometric formulation and MCMC-based analysis of the volatility puzzle in stock markets. We have implemented our algorithms in C++ (along with an R interface) and we illustrate the power of our approach by performing extensive experiments on real data. Our analyses provide accurate detection and new insights into the distribution of portfolios' performance characteristics. Moreover, we use our tools to show that classical methods for low-volatility anomaly detection in finance form bad proxies that could lead to misleading or inaccurate results.
    Date: 2022–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2205.03852&r=
  20. By: Ryan Defina (International Association of Deposit Insurers)
    Abstract: The field of deposit insurance is yet to realise fully the potential of machine learning, and the substantial benefits that it may present to its operational and policy-oriented activities. There are practical opportunities available (some specified in this paper) that can assist in improving deposit insurers’ relationship with the technology. Sharing of experiences and learnings via international engagement and collaboration is fundamental in developing global best practices in this space.
    Keywords: deposit insurance, bank resolution
    JEL: G21 G33
    Date: 2021–09
    URL: http://d.repec.org/n?u=RePEc:awl:finbri:3&r=

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.