In finance, risk can be defined as the probability of not obtaining an expected return. Many mechanisms and contracts make it possible to shift this risk to third parties in return for a payment, which therefore corresponds to the price of the risk. This is a central element of finance. This issue of the REF addresses three aspects of this concept. The first part analyzes the determinants that constitute risk: individual perception, frequency, stability, risk aversion, etc. The second part focuses on the prices of certain specific, well-identified actual risks, some of which play an increasing role, such as cyber risks, natural disasters, and dependency. Finally, the third part focuses on the price of certain synthetic risks, such as the equity risk premium, which result from the combination of elementary risks and represent a real challenge for analysts.
In this issue, the Review also publishes a financial history column devoted to the concept of odious debt and its evolution. The issue also contains a review of the future of the eurozone, as well as a "miscellaneous" article on the decentralization of cryptocurrencies.
Publication: June 2019, 318 pages.
The aim of this paper is to analyze the consequences of introducing heterogeneous beliefs and impatience rates into otherwise standard valuation models. We first show that the various arguments put forward in the literature to deny this heterogeneity or to neglect its effects are contradicted both by the data and by the most recent research. We then introduce a typology of these beliefs, analyze their various forms and study how the market aggregates them. We deduce the impact of this heterogeneity of perceptions on the risk premium (or the market price of risk) and on the price of time (the discount rate or interest rate). In particular, we show that, in the long run, the heterogeneity of perceptions should lead to more cautious valuation: a lower discount rate and a higher risk premium.
Risk aversion is the well-documented psychological bias that makes us refuse to participate in fair (zero expected gain) lotteries: to accept a 50% chance of losing one dollar, we need to be offered a 50% chance of gaining more than one dollar, e.g. two dollars instead of one, as compensation for taking risk. Through the same mechanism, investors must be rewarded for accepting financial risk: they are compensated by receiving greater average returns than risk-free assets, the difference being the equity risk premium. We have evidence that this compensation for risk varies over time: investors can expect greater returns when index levels are low relative to distributed dividends than when they are high. Does this come from time variation in risk aversion, or are there other forces at play? Indirect evidence suggests that not only the compensation for risk but also the quantity of risk in financial markets may vary over time. Variations in risk aversion would thus not be the sole channel for changes in equity risk premia over time, though they may still play an important role. Disentangling the two sources of variation in the price of risk, risk aversion and the quantity of risk, and understanding why and how they may vary over time is at the core of recent research in asset pricing theory.
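The lottery logic above can be sketched with a toy expected-utility calculation. This is a minimal illustration, not taken from the article: the CRRA utility function, the wealth level, and the loss size are all assumptions chosen for the example.

```python
import math

def crra_utility(w, gamma):
    """CRRA utility of wealth w; gamma > 0 is risk aversion (log utility at gamma = 1)."""
    if gamma == 1.0:
        return math.log(w)
    return (w ** (1.0 - gamma) - 1.0) / (1.0 - gamma)

def required_gain(wealth, loss, gamma, tol=1e-10):
    """Smallest gain such that a 50/50 lottery (+gain / -loss) is acceptable,
    i.e. the lottery's expected utility equals the utility of current wealth.
    Solved by bisection; for a risk-averse agent the gain must exceed the loss."""
    u0 = crra_utility(wealth, gamma)
    lo, hi = loss, 100.0 * loss
    g = 0.5 * (lo + hi)
    while hi - lo > tol:
        g = 0.5 * (lo + hi)
        eu = 0.5 * crra_utility(wealth + g, gamma) + 0.5 * crra_utility(wealth - loss, gamma)
        if eu < u0:
            lo = g
        else:
            hi = g
    return g

# With wealth 10 and a possible loss of 1, a log-utility agent demands a gain
# strictly greater than 1 dollar: the compensation for taking risk.
g = required_gain(10.0, 1.0, gamma=1.0)
```

For log utility the answer is available in closed form (10 + g = 100/9, so g is about 1.11 dollars), which the bisection recovers.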
The outflows of funds under management at private banks and other wealth management institutions have recently reached surprising amounts, sometimes temporarily imperiling the profitability of the institutions they flow out from. This article therefore proposes designing a proactive client-risk management grounded in a convincing explanation of such outflows. An innovative implementation of Article 25 of the MiFID II regulation could have provided an appropriate basis to this end. The article shows that this is not the case, given the implementation tools generally selected, based on fixed questionnaires and qualitative scoring, which lead to results opposite to those the notion of client suitability was intended to achieve. Beyond this assessment, the paper offers a synthetic view of the evolution of risk analysis and shows that only today's generalized appraisal of risk tolerance, going beyond the von Neumann-Morgenstern vision, can provide the explanation sought. The authors recommend using interactive digital tools, based on this model, that quickly "learn" the investor's risk profile in order to implement proactive client-risk management. This solution also restores true compliance with the MiFID II regulation.
Economic theory teaches that the price of diversifiable risk is zero. However, this is not the case in practice: the price of the best diversifiable risks covered by insurers in a competitive market, such as auto insurance, does not tend towards zero, and does not even tend to decrease over the long run. This paradox can be explained by many factors that, taken individually, most often explain little, and that must be combined to arrive at a more or less accurate picture of reality. The purpose of this article is to provide food for thought to better understand the determinants of the actual market situation.
In the first part, we return to the notions of risk, diversifiable risk and the price of diversifiable risk, in order to clarify what exactly is supposed to tend towards zero. In the second part, we examine the different causes of this deviation and seek to validate some of them, without, however, the rigor of a full quantitative analysis, which is beyond the scope of this article. In the third part, we ask why we do not observe, at the very least, a downward trend in the price of diversifiable risk. In the fourth part, we examine the ambivalent role of the prudential regulation of insurance in this respect, the term "ambivalent" being strictly factual, without any negative connotation.
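As a preliminary illustration of why the price of diversifiable risk is supposed to tend towards zero, a small Monte-Carlo sketch (with entirely hypothetical claim frequency and severity) shows the per-policy uncertainty, which drives the risk loading on top of the expected claim, shrinking as the pool of independent risks grows:

```python
import random
import statistics

random.seed(0)

def avg_claim_std(n_policies, n_trials=2000, p=0.05, severity=10_000.0):
    """Monte-Carlo standard deviation of the average claim per policy in a
    pool of n_policies independent Bernoulli(p) risks with a fixed severity.
    By the law of large numbers, this shrinks roughly like 1/sqrt(n)."""
    means = []
    for _ in range(n_trials):
        total = sum(severity for _ in range(n_policies) if random.random() < p)
        means.append(total / n_policies)
    return statistics.stdev(means)

# Uncertainty per policy falls sharply as the pool grows from 10 to 1,000 risks.
s10 = avg_claim_std(10)
s1000 = avg_claim_std(1000)
```

The theoretical ratio between the two pool sizes is 1/sqrt(100) = 1/10; the article's point is precisely that observed premiums do not follow this vanishing risk loading.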
The pricing of risk is a core driver of the insurance industry's effectiveness in improving system resilience. Insurers have two core functions. On the asset side, they are long-term investors, matching the duration of their liabilities. With their huge asset base of about USD 30 trillion, insurers can act as capital market shock absorbers. On the liability side, insurers help households, businesses and governments protect themselves against potential financial losses arising from the occurrence of adverse events. On both sides, the pricing of risk is a key determinant of how much insurers can contribute to making society more resilient.
The Gollier Commission on risk pricing for public decision-making proposed using "social risk premiums", favoring projects that provide insurance against fluctuations in aggregate wealth and penalizing those whose fundamentals are correlated with economic activity. Analyzing why the implementation of this approach proves difficult is instructive. In particular, the coherence to be ensured between the macroeconomic risk premium and the discount rate appears essential. The scope for social risk pricing goes beyond the cost-benefit analysis of public investments and environmental policies. We also need such references for the evaluation of all security or safety regulations; to guide the State as shareholder in choosing the appropriate weighted average cost of capital for its operators; and to guide sectoral regulators in establishing access fees. However, the corresponding policies generally take place in second-best contexts, for which we must further integrate the links between risk assessment and risk management.
In this article, we address the issue of the price of longevity risk. We begin by describing longevity risk and its components, distinguishing biometric, financial and regulatory aspects. We then explain the different valuation frameworks (actuarial, financial and regulatory), their common points and their differences. We discuss the issue of discounting and of modeling long-term interest rates for longevity risk management. We also detail the subjective and pragmatic ways in which the different components of longevity risk, especially the most extreme, are handled in the market.
In this article, we present several formulas that insurance companies could adopt to cover the costs of dependency. The two most common are the lump-sum formula, which pays a fixed permanent annuity, and the indemnity formula, which reimburses all expenses incurred during a certain period up to a ceiling. Neither formula prevents an insured facing a long period of dependency from exhausting all his wealth, or even from becoming dependent on his children. To cover this risk of long dependency, we recommend applying a deductible principle: beyond a certain period of dependency, all costs would be reimbursed by the insurer.
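The three payout formulas can be sketched in a few lines. This is a hypothetical illustration: the amounts, ceiling and deductible period are assumptions for the example, not figures from the article.

```python
def annuity_payout(months_dependent, monthly_annuity):
    """Lump-sum formula: a fixed annuity paid for every month of dependency."""
    return months_dependent * monthly_annuity

def indemnity_payout(monthly_costs, ceiling):
    """Indemnity formula: actual costs reimbursed, capped by an overall ceiling."""
    return min(sum(monthly_costs), ceiling)

def deductible_payout(monthly_costs, deductible_months):
    """Deductible principle: the insured bears the first deductible_months of
    costs, after which all costs are reimbursed by the insurer."""
    return sum(monthly_costs[deductible_months:])

# A long, 60-month dependency with costs of 2,000 per month (total 120,000):
costs = [2000.0] * 60
a = annuity_payout(60, 1500.0)         # annuity pays 90,000 regardless of costs
i = indemnity_payout(costs, 50_000.0)  # capped at 50,000: residual risk stays with the insured
d = deductible_payout(costs, 36)       # beyond 36 months, full coverage: 48,000
```

The sketch makes the abstract's point concrete: only the deductible formula removes the open-ended tail risk of a very long dependency, since coverage is complete once the deductible period has elapsed.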
In an increasingly digital economy, cyber risks are growing. Faced with the multitude of vulnerabilities and attack techniques, any quantitative approach to this risk must rely on invariants such as the nature of the consequences for corporations or the motivation of the attackers. Expertise in cyber-security is needed to build a methodological framework capable of integrating these risk determinants, but also to overcome the lack of historical experience and to build a forward-looking approach that adapts to the constant developments on the battlefield of cyber warfare.
The digital industry seems to push back the limits of the network effect. The concentration of players, the standardization of products and the connectivity of activities create the conditions for risk accumulation on a large scale. Insurers and reinsurers are thus faced with the pressing need to model this catastrophe risk, and must, in just a few years, travel a path that took decades for natural catastrophe risks.
Whether the cyber risk is individual or catastrophic, the insurance industry must create the price signal necessary to reorient the purchasing or usage patterns of economic agents, and in so doing, foster the development of the security and diversity necessary for the sustainability of a digitized economy.
Faced with the global externality of climate change, there is a strong consensus among academic economists in favor of the price instrument: a universal carbon price, supported by means of a tax, given the global nature of the climate externality. However, given the very strong long-term uncertainties about climate dynamics, green technological progress, the credibility of the commitments of the different countries, and economic growth, and therefore about the evolution of emissions, determining this price remains a huge challenge for climate economists. It is a matter of valuing present actions that generate a flow of economic and environmental benefits, which is very complicated in this context. In this article, I offer a synthesis of my recent work on the subject by presenting the two possible methods: the cost-benefit approach, in which the price of carbon is the present value of the flow of damages avoided by reducing current emissions by one tonne of CO2, and the cost-effectiveness approach, in which a CO2 budget for the century is fixed a priori, the price of carbon being the dual variable of this climate constraint.
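The cost-benefit definition, carbon price as the present value of avoided marginal damages, can be illustrated with a toy computation. The flat damage flow and the discount rate below are purely illustrative assumptions, not estimates from the literature.

```python
def carbon_price_cost_benefit(marginal_damages, discount_rate):
    """Cost-benefit carbon price: present value of the stream of marginal
    damages avoided by abating one tonne of CO2 today.
    marginal_damages[t] is the damage (in $/tCO2) occurring in year t."""
    return sum(d / (1.0 + discount_rate) ** t
               for t, d in enumerate(marginal_damages))

# Hypothetical flat damage flow of $2/tCO2 per year over a century, at 3%:
damages = [2.0] * 100
price = carbon_price_cost_benefit(damages, 0.03)  # roughly $65/tCO2
```

The sketch also shows why the uncertainties listed in the abstract matter so much: the result is highly sensitive to both the assumed damage path and the discount rate.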
The price of risk in the evaluation of public policies and investments is tackled here from two angles: on the one hand, the way in which the question of risk was discussed in several working groups at France Stratégie (which prepared the main reference frameworks used in the socio-economic evaluation of state investments); on the other hand, the way in which these tools are actually mobilized in socio-economic evaluations (as observed in the studies appraised in recent years in France by the General Secretariat for Investment). The article then discusses the main challenges that need to be addressed to improve the practice of studies on major projects, to ensure that appraisals adequately take into account the risks and uncertainties surrounding expected benefits and costs, and thus to make them genuinely useful in public debate and decision-making.
The equity premium measures the return obtained by investing in equities in excess of a short-term Treasury bill return. In the last thirty years, the financial literature has proposed various risk models to rationalize the magnitude of this premium, which amounts to an annual 6 % to 8 % for most industrialized countries for the post-war period. We review the various models and explain their increased complexity to capture not only the mean premium but also its variability, its predictability at various horizons and its term structure.
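The premium's definition as a mean excess return can be sketched in a few lines. The return series below are made up for illustration; they are not data from the literature reviewed.

```python
import statistics

def equity_premium(equity_returns, tbill_returns):
    """Average annual excess return of equities over the short-term
    Treasury bill return (both expressed as decimal annual returns)."""
    excess = [e - r for e, r in zip(equity_returns, tbill_returns)]
    return statistics.mean(excess)

# Illustrative (invented) annual return series over five years:
equities = [0.12, -0.05, 0.20, 0.08, 0.15]
tbills   = [0.03, 0.02, 0.04, 0.03, 0.02]
premium = equity_premium(equities, tbills)  # 0.072, i.e. 7.2% per year
```

The challenge discussed in the article is not computing this mean, which is trivial, but building risk models whose implied premium matches the 6% to 8% observed, along with its variability, predictability and term structure.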
Our modern world is giving more and more attention to flexibility. This tendency appears in economic analysis through the increasing interest in dynamic programming and option values. Without aiming to provide a full view of such large fields, the article tries to clarify and illustrate the most remarkable general result: the decision should be optimized, and generally postponed, in order to take advantage of the increasing information that arrives as time passes. In terms of economic analysis, and especially of the treatment of the discount rate, the consequences differ according to the type of decision: whether it concerns a private good, priced through financial and goods markets, or a public good, designed to maximize some form of welfare.
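The general result, that postponing a decision can be valuable because information accrues, can be illustrated with a minimal two-period real-option sketch. All parameters are hypothetical, and the timing of payoffs is deliberately simplified.

```python
def value_of_waiting(cost, payoff_up, payoff_down, p_up, discount):
    """Two-period real-option sketch: either invest now at `cost` for an
    uncertain payoff, or wait one period, observe the state, and invest
    only if profitable. Returns (value of investing now, value of waiting)."""
    expected_payoff = p_up * payoff_up + (1.0 - p_up) * payoff_down
    value_now = expected_payoff - cost
    # Waiting: next period the state is known, so invest only when payoff > cost.
    value_wait = discount * (p_up * max(payoff_up - cost, 0.0)
                             + (1.0 - p_up) * max(payoff_down - cost, 0.0))
    return value_now, value_wait

# Even a project with a positive expected NPV today can be worth postponing:
now, wait = value_of_waiting(cost=100.0, payoff_up=180.0, payoff_down=60.0,
                             p_up=0.5, discount=0.95)
# now = 20 (invest immediately), wait = 38 (keep the option open)
```

Waiting dominates here because the option to not invest in the bad state is worth more than the cost of one period's discounting, which is exactly the flexibility premium the article discusses.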
Following the sub-prime crisis, in 2010 the European Parliament asked the Commission to implement a tax on financial transactions intended to discourage purely speculative trading. To date, only France (2012) and Italy (2013) have introduced such a tax. The French financial transaction tax experiment has been studied in a number of academic articles, whose main empirical results are quite concordant. However, the recommendations drawn from them are quite different. To try to understand these differences, we first present the various arguments against the tax. We then explain why we are not convinced by them, and formulate a series of proposals for extending the FTT. These proposals, if adopted, would in our view lead to a genuine FTT, which could increase tax revenues without much negative impact.
This paper examines the degree of decentralization of cryptocurrencies and provides comparisons with standard currencies. First, we show that cryptocurrencies are hybrid systems that are both centralized and decentralized. Their software architecture is decentralized, their logical working is centralized, and their institutional and organizational dimensions are partially centralized. Second, we show that the blockchain community is actively working to make the logical and organizational dimensions more decentralized. In doing so, creators of cryptocurrencies rediscover some fundamental mechanisms of the banking system.