RISK MANAGEMENT
By
Dr. BOUALEM BENDJILALI
ABSTRACT:
Risk management is one of the most important innovations of the 20th century.
Large derivatives losses and other financial incidents raised banks' consciousness
of risk. Banks became subject to regulatory capital requirements, internationally
coordinated by the Basel Committee at the Bank for International Settlements.
Financial institutions have set up research departments to develop new quantitative
tools to deal with the issue. Our presentation sheds light on some of the
quantitative tools used in the management of risk, and in particular in
financial risk management. The presentation has six sections. The first
section discusses financial risk and its importance, whereas the second section
presents the nature of the challenge. Section 3 discusses the types of financial
risks. Section 4 formulates the concept of the loss function. Section 5
quantitatively defines the most important risk measures, among them
Value at Risk (VaR) and Expected Shortfall (ES). Finally,
section 6 briefly presents the importance of time series models in risk
management.
OUTLINE
Steinherr (1998) called risk management "one of the most important
innovations of the 20th century."
The late 20th century saw a revolution in financial markets. It was an era of
innovation in academic theory, product development (derivatives) and
information technology.
Large derivatives losses and other financial incidents raised banks' consciousness
of risk.
Banks became subject to regulatory capital requirements, internationally
coordinated by the Basel Committee at the Bank for International Settlements.
II. Quantitative Risk Management: The Nature of the Challenge
The challenge is to put current practice onto a mathematical framework where
concepts like profit-and-loss distributions, risk factors, risk measures, capital
allocation and risk aggregation are given formal mathematical definitions. It is
therefore important to know the interrelation between the different types of risks as
well as the different quantitative and non-quantitative methods needed to treat the
subject.
Extremes Matter
Alan Greenspan, at the Joint Central Bank Research Conference in 1995, said:
"From the point of view of the risk manager, inappropriate use of the normal
distribution can lead to an understatement of risk, which must be balanced against
the significant advantage of simplification. From the central bank's corner, the
consequences are even more serious because we often need to concentrate on the
left tail of the distribution in formulating lender-of-last-resort policies. Improving
the characterization of the distribution of extreme values is of paramount
importance."
John Meriwether wrote in The Wall Street Journal in August 2000: "With
globalization increasing, you'll see more crises. Our whole focus is on the
extremes now - what's the worst that can happen to you in any situation - because
we never want to go through that [LTCM] again." We need models that capture
the related phenomena of heavy tails, volatility and extreme values.
[John Meriwether, The Wall Street Journal, 21st August 2000]
This led scholars to develop mathematical models that deal with phenomena
with extreme values.
Business Week (1998) wrote: "Extreme, synchronized rises and
falls in financial markets occur infrequently but they do occur. The problem with
the models is that they did not assign a high enough chance of occurrence to the
scenario in which many things go wrong at the same time - the 'perfect storm'
scenario."
In a perfect storm scenario the risk manager discovers that the diversification he
thought he had is illusory; practitioners also describe this as a concentration of risk.
Concentration Risk
Scholes (2000) said in his article in The American Economic Review:
"Over the last number of years, regulators have encouraged financial entities to use
portfolio theory to produce dynamic measures of risk. VaR, the product of
portfolio theory, is used for short-run, day-to-day profit-and-loss exposures. Now is
the time to encourage financial institutions and other regulatory bodies to support
studies on stress test and concentration methodologies. Planning for crises is more
important than VaR analysis. And such new methodologies are the correct
response to recent crises in the financial industry."
Interdisciplinary
The quantitative risk manager of the future should have a combined skillset that
includes concepts, techniques and tools from many fields:
Mathematical finance;
Statistics and financial econometrics;
Actuarial mathematics;
Non-quantitative skills, especially communication skills;
III. Types of Financial Risks
Financial risks can be broadly classified into several categories, namely market
risk, credit risk, liquidity risk, operational risk, and legal risk.
Market risk is the risk of a change in the value of a financial
position due to changes in the value of the underlying components on which that
position depends, such as stock and bond prices, exchange rates, commodity
prices, etc. That is, it is the risk of loss arising from changes in the value of tradable
or traded assets.
Credit risk is the risk of loss due to the failure of the counterparty to pay the
promised obligation.
Liquidity risk is the risk of loss arising from the inability either to meet payment
obligations (funding liquidity risk) or to liquidate positions with little price impact
(asset liquidity risk).
Operational risk is the risk of loss caused by inadequate or failed internal
processes, people and systems, or external events.
Legal risk is the risk of loss arising from uncertainty about the enforceability of
contracts.
IV. The Loss Function
Consider a portfolio and let Vt denote its value at time t; we assume this random
variable is observable at time t. Suppose we look at risk from the perspective of
time t and consider the time period [t, t + 1]. The value Vt+1 of the portfolio at the
end of the period is unknown to us. The distribution of (Vt+1 - Vt) is known
as the profit-and-loss or P&L distribution. We denote the loss by

Lt+1 = -(Vt+1 - Vt).    (1)

By this convention, losses will be positive numbers and profits negative. We refer
to the distribution of Lt+1 as the loss distribution.
Generally the value of the portfolio at time t will depend on time and a set of
observable risk factors. Let Zt = (Zt,1, Zt,2, . . . , Zt,n) denote the vector of risk
factors. Formally, we can write

Vt = f(t, Zt), where f : R+ x Rn -> R.    (2)
Defining the risk-factor change Xt+1 = Zt+1 - Zt, the loss can be written as

Lt+1 = -[f(t + 1, Zt + Xt+1) - f(t, Zt)].    (3)

As of time t, the only random part is the risk-factor change Xt+1. Hence the loss
distribution is determined by the mapping f and by the distribution of the risk-factor
change. Sometimes we use a linearized version of (3):
L∆t+1 = -[ft(t, Zt) ∆t + Σ_{i=1}^{n} fzi(t, Zt) Xt+1,i],    (4)

where subscripts denote partial derivatives and ∆t is the risk-management
horizon.
Consider n stocks; let λi denote the number of shares in stock i at time t and let St,i
denote its price. Following standard convention, the risk factors
may be taken to be the logarithmic prices

Zt,i = ln St,i,    (5)

so that the risk-factor changes Xt+1,i = ln St+1,i - ln St,i are the log returns. The
mapping is

Vt = Σ_{i=1}^{n} λi exp(Zt,i),    (6)

and the loss becomes

Lt+1 = -(Vt+1 - Vt) = -Vt Σ_{i=1}^{n} wt,i (exp(Xt+1,i) - 1),    (7)

where wt,i = λi St,i / Vt is the relative weight of stock i at time t. Here there is no
explicit time dependence in the mapping (6). The partial derivatives with respect to
the risk factors are

fzi(Zt) = λi exp(Zt,i) = λi St,i  for i = 1, . . . , n,

and hence the linearized loss (the right-hand side of equation (4)) is

L∆t+1 = -Σ_{i=1}^{n} λi St,i Xt+1,i = -Vt Σ_{i=1}^{n} wt,i Xt+1,i.    (8)
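To make the mapping concrete, here is a minimal Python sketch (the share counts, prices and log-return shocks are made up for illustration) comparing the exact portfolio loss -Vt Σ wt,i (exp(Xt+1,i) - 1) with its linearized version -Vt Σ wt,i Xt+1,i:

```python
import numpy as np

# Hypothetical three-stock portfolio: share counts lambda_i and prices S_{t,i}.
shares = np.array([100.0, 50.0, 200.0])
prices = np.array([40.0, 120.0, 25.0])

V_t = float(shares @ prices)        # portfolio value V_t
w = shares * prices / V_t           # relative weights w_{t,i}

# Assumed one-period log-return shocks X_{t+1,i}.
X = np.array([0.01, -0.02, 0.005])

exact_loss = -V_t * float(w @ (np.exp(X) - 1.0))   # exact loss L_{t+1}
linear_loss = -V_t * float(w @ X)                  # linearized loss

print(V_t, exact_loss, linear_loss)
```

For small risk-factor changes the two agree closely; the linearization error grows with the size of the shocks.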
While the original Basel Accord focused primarily on the (credit) risks associated
with the issuer, the 1996 amendment sought to give more coverage to market risk.
The amendment imposes certain regulatory requirements on the internal models
from which banks calculate their capital requirements.
One requirement is that the risk-management group in charge of the
development and execution of these models should be independent of the
business units it monitors and should report directly to senior management.
Another requirement is that, besides calculating the regulatory capital
requirements, these models should be fully integrated into the bank's risk
measurement and management, and backtesting and stress testing should be
performed on their performance on a regular basis.
Discrete Returns
Let Pt denote the asset price at time t. Suppose the asset pays no
dividends over the period from time t - 1 to time t.
The one-period net return on this asset is Rt = (Pt - Pt-1)/Pt-1, and the one-
period gross return is Pt/Pt-1 = 1 + Rt.
The gross return over k periods is defined as

1 + Rt(k) = Pt/Pt-k = (1 + Rt)(1 + Rt-1) · · · (1 + Rt-k+1),

and the net return over these periods is Rt(k). In practice, we usually use years
as the time unit. The annualized gross return for holding an asset over k years is
[1 + Rt(k)]^(1/k), and the annualized net return is [1 + Rt(k)]^(1/k) - 1.
An n-period log return is the sum of n simple single-period log returns (the
additivity of multi-period returns):

rt(n) = ln[1 + Rt(n)] = rt + rt-1 + · · · + rt-n+1,  where rt = ln(1 + Rt) = ln Pt - ln Pt-1.
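These multi-period identities (gross returns multiply, log returns add) are easy to verify numerically; a small sketch with hypothetical prices:

```python
import numpy as np

# Hypothetical closing prices for one asset over four periods.
P = np.array([100.0, 102.0, 101.0, 105.0])

net = P[1:] / P[:-1] - 1.0       # one-period net returns R_t
gross = 1.0 + net                # one-period gross returns
log_ret = np.diff(np.log(P))     # one-period log returns r_t

# Gross returns multiply: 1 + R_t(k) = P_t / P_{t-k}.
k_gross = gross.prod()
assert np.isclose(k_gross, P[-1] / P[0])

# Log returns add: r_t(n) = sum of single-period log returns.
assert np.isclose(log_ret.sum(), np.log(P[-1] / P[0]))

# Per-period (e.g. annualized, if each period is a year) net return.
annualized_net = k_gross ** (1 / len(net)) - 1.0
print(net, k_gross, annualized_net)
```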
V. Risk Measures
Risk measures attempt to quantify the riskiness of a portfolio. Most risk measures
are statistics of the loss distribution, such as the variance, Value at Risk (VaR) or
Expected Shortfall (ES). The most popular risk measures, like VaR, describe the
right tail of the loss distribution. For a confidence level α (e.g. α = 0.99), VaR is the
α-quantile of the loss distribution,

VaRα = inf{ l : P(Lt+1 > l) ≤ 1 - α },

and the expected shortfall is the expected loss given that the loss exceeds VaR:

ESα = E[ Lt+1 | Lt+1 ≥ VaRα ].
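Both measures can be estimated empirically from a sample of losses: VaR as a sample quantile, ES as the average of the losses beyond it. A minimal sketch with simulated heavy-tailed (Student-t) losses standing in for a real P&L history:

```python
import numpy as np

# Simulated heavy-tailed loss sample (positive values are losses).
rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=100_000)

alpha = 0.99
var_hat = np.quantile(losses, alpha)        # empirical VaR: sample alpha-quantile
es_hat = losses[losses >= var_hat].mean()   # empirical ES: mean loss beyond VaR

print(var_hat, es_hat)
```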
Advantages of ES
ES takes the whole tail of the distribution beyond VaR into account; in
particular ES > VaR.
ES has better properties regarding aggregation of risk. This is related to
so-called coherence of risk measures.
Although VaR has become a standard risk measure for financial risk management,
it has been criticized for disregarding losses beyond the VaR level. To remedy this
difficulty with VaR, Artzner et al. (1997, 1999) propose a set of axioms that risk
measures need to satisfy in order to be coherent. Let XA and XB denote the
change in value of financial positions A and B, respectively, over a time horizon.
The risk measure ρ(·) is said to be coherent if it satisfies the following
properties:
1. Monotonicity: if XA ≤ XB, then ρ(XA) ≥ ρ(XB).
2. Sub-additivity: ρ(XA + XB) ≤ ρ(XA) + ρ(XB).
3. Positive homogeneity: ρ(λ XA) = λ ρ(XA) for λ ≥ 0.
4. Translation invariance: ρ(XA + c) = ρ(XA) - c for any riskless amount c.
Sub-additivity is the most debated property. It is deemed necessary for the following reasons:
Reflects idea that risk can be reduced by diversification and that a merger
creates no extra risk.
Makes decentralized risk management possible.
If a regulator uses a non-sub-additive risk measure, a financial institution could
reduce risk capital by splitting into subsidiaries.
Remarks:
VaR is in general not coherent. ES (as we have defined it) is coherent.
Non-sub-additivity of VaR is relevant in the presence of skewed loss distributions
or if traders optimize against VaR.
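A standard toy illustration of this failure (the numbers below are assumed for illustration, not from the text): two independent defaultable bonds, each losing 100 with probability 0.02 and otherwise producing a small gain (loss -2). Individually, the 97.5% VaR is negative, but the two bonds held together have a large positive VaR, so VaR of the sum exceeds the sum of the VaRs:

```python
import numpy as np

def var_discrete(values, probs, alpha):
    """VaR_alpha for a discrete loss distribution:
    the smallest l with P(L <= l) >= alpha."""
    order = np.argsort(values)
    v, p = np.asarray(values)[order], np.asarray(probs)[order]
    cdf = np.cumsum(p)
    return v[np.searchsorted(cdf, alpha)]

# One bond: default (loss 100) with prob 0.02, else a small gain (loss -2).
var_single = var_discrete([-2.0, 100.0], [0.98, 0.02], 0.975)

# Two independent such bonds held together.
v2 = [-4.0, 98.0, 200.0]
p2 = [0.98**2, 2 * 0.98 * 0.02, 0.02**2]
var_pair = var_discrete(v2, p2, 0.975)

print(var_single, var_pair)
assert var_pair > 2 * var_single   # sub-additivity fails for VaR here
```

Splitting the pair into two subsidiaries would thus report far less VaR-based risk capital than holding it whole, which is exactly the regulatory concern noted above.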
Suppose the return of the position over the horizon is normally distributed with
mean μ and standard deviation σ. For a long position the loss is the negative of the
return, so

VaRα = -μ + σ zα  and  ESα = -μ + σ φ(zα)/(1 - α),

where zα is the α-quantile of the standard normal distribution and φ(·) is
the density function of the standard normal distribution, for which zα = -z1-α.
Similarly, for a short position the loss is the return itself, so

VaRα = μ + σ zα  and  ESα = μ + σ φ(zα)/(1 - α).
In practice, both μ and σ are unknown and have to be estimated from past data.
For VaR calculations, the normal distribution is often replaced by a Student
t-distribution, which has fatter tails.
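A sketch of the parametric calculation for a long position, with hypothetical estimates for μ and σ; the unit-variance Student-t quantile is obtained by simulation so the example needs only the standard library and NumPy:

```python
import numpy as np
from statistics import NormalDist

# Assumed (hypothetical) estimated daily return mean and volatility.
mu_hat, sigma_hat = 0.0005, 0.012
alpha = 0.99

# Normal-based VaR/ES per unit of portfolio value (loss = -return).
z = NormalDist().inv_cdf(alpha)                   # z_alpha
phi = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)      # standard normal density at z
var_normal = -mu_hat + sigma_hat * z
es_normal = -mu_hat + sigma_hat * phi / (1 - alpha)

# Fatter-tailed alternative: unit-variance Student-t quantile by simulation.
rng = np.random.default_rng(1)
nu = 4
t_sample = rng.standard_t(nu, size=1_000_000) * np.sqrt((nu - 2) / nu)
var_t = -mu_hat + sigma_hat * np.quantile(t_sample, alpha)

print(var_normal, es_normal, var_t)
```

Even with the same mean and variance, the heavier t tail produces a larger 99% VaR than the normal model.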
The covariances σi,j of the asset returns are the essential ingredients of portfolio
risk. Since they are unknown in practice, the task of estimating them from limited
historical data becomes increasingly difficult with an increasing number of
assets.
A widely used method to overcome this curse of dimensionality in VaR
models is to perform principal component analysis (PCA) to determine
whether the portfolio variance σ² = Σ_{i} Σ_{j} wi wj σi,j can be approximated by
the sum of the variances of a relatively small number of principal components.
Example
Consider the seven weekly U.S. Treasury rates with maturities 1, 2, 3, 5, 7, 10, and
20 years from October 1, 1993 to March 23, 2007. Table 1 gives the factor
loadings and standard deviations of factor scores using the covariance matrix of
interest rate changes. Note that the first three principal components account for
about 99% of the variance in the data.
Table 1: PCA of seven U.S. Treasury rates (standard deviations of factor scores,
proportions of variance, and factor loadings).
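Since the Treasury data behind Table 1 are not reproduced here, the following sketch applies the same PCA recipe to simulated rate changes driven by hypothetical "level" and "slope" factors; the first few eigenvalues of the sample covariance matrix then carry almost all the variance, as in the example:

```python
import numpy as np

# Simulated weekly changes for 7 hypothetical yields (not the actual data).
rng = np.random.default_rng(2)
n, k = 700, 7
level = rng.normal(0, 0.08, size=(n, 1))                    # common level factor
slope = rng.normal(0, 0.03, size=(n, 1)) * np.linspace(-1, 1, k)
noise = rng.normal(0, 0.01, size=(n, k))                    # idiosyncratic noise
dX = level + slope + noise                                  # rate changes

# PCA = eigendecomposition of the sample covariance matrix.
cov = np.cov(dX, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)                        # ascending order
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]              # largest first

prop = eigval / eigval.sum()                                # variance proportions
print(np.cumsum(prop)[:3])                                  # first three PCs
```

The columns of `eigvec` play the role of the factor loadings in Table 1, and `np.sqrt(eigval)` gives the standard deviations of the factor scores.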
VI. Time Series Models in Risk Management
The VaR calculations above assume that the returns are i.i.d. However, since risk
may vary with time and since the available data are time series of past returns,
better VaR models can be built by incorporating the time series properties of the
returns data. In particular, linear time series models can be used to model the mean
returns rt over time, and conditional heteroskedastic models can be used to model
the time-varying volatilities σt, as in the following ARMA(p, q)-GARCH(h, k)
model for (rt, σt):

rt = μ + Σ_{i=1}^{p} φi rt-i + ut + Σ_{j=1}^{q} ψj ut-j,  ut = σt εt,
σt² = ω + Σ_{i=1}^{h} ai ut-i² + Σ_{j=1}^{k} bj σt-j²,

where the εt are i.i.d. standard normal or standardized Student-t random variables.
The model can be estimated by maximum likelihood and implemented by the
garchfit function in the GARCH toolbox of MATLAB. The conditional
distribution of rt+1 given the information available at time t is N(μt+1,t, σ²t+1,t),
where μt+1,t and σ²t+1,t are given by

μt+1,t = μ + Σ_{i=1}^{p} φi rt+1-i + Σ_{j=1}^{q} ψj ut+1-j,
σ²t+1,t = ω + Σ_{i=1}^{h} ai u²t+1-i + Σ_{j=1}^{k} bj σ²t+1-j.
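A minimal simulation sketch of the AR(1)-GARCH(1,1) special case with assumed parameters (in practice they would be estimated by maximum likelihood, e.g. via MATLAB's garchfit), showing the one-step-ahead conditional mean and variance and the resulting normal-based VaR:

```python
import numpy as np

# Assumed (not estimated) AR(1)-GARCH(1,1) parameters.
mu, phi = 0.0002, 0.05             # mean equation: r_t = mu + phi*r_{t-1} + u_t
omega, a1, b1 = 1e-6, 0.08, 0.90   # volatility: sig2_t = omega + a1*u² + b1*sig2

rng = np.random.default_rng(3)
n = 2000
r = np.zeros(n)
u = np.zeros(n)
sig2 = np.full(n, omega / (1 - a1 - b1))   # start at unconditional variance
for t in range(1, n):
    sig2[t] = omega + a1 * u[t-1]**2 + b1 * sig2[t-1]
    u[t] = np.sqrt(sig2[t]) * rng.standard_normal()
    r[t] = mu + phi * r[t-1] + u[t]

# One-step-ahead conditional mean and variance given data up to time n-1.
mu_next = mu + phi * r[-1]
sig2_next = omega + a1 * u[-1]**2 + b1 * sig2[-1]

# 99% one-step VaR for a long position of value V (2.3263 = z_{0.99}).
V = 1_000_000
var_99 = V * (-mu_next + np.sqrt(sig2_next) * 2.3263)
print(mu_next, np.sqrt(sig2_next), var_99)
```

Because the forecast variance reacts to recent shocks, this VaR widens after volatile periods and tightens in calm ones, unlike the i.i.d. calculation.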
REFERENCES
[Abramowitz and Stegun, 1965] Abramowitz, M. and Stegun, I., editors (1965).
Handbook of Mathematical Functions. Dover Publications, New York.
[Acerbi and Tasche, 2002] Acerbi, C. and Tasche, D. (2002). On the coherence of
expected shortfall. J. Banking Finance, 26:1487–1503.
[Artzner et al., 1999] Artzner, P., Delbaen, F., Eber, J., and Heath, D. (1999).
Coherent measures of risk. Math. Finance, 9:203–228.
[Balkema and de Haan, 1974] Balkema, A. and de Haan, L. (1974). Residual life
time at great age. Ann. Probab., 2:792–804.
[Black and Scholes, 1973] Black, F. and Scholes, M. (1973). The pricing of
options and corporate liabilities. J. Polit. Economy, 81(3):637–654.
[Bluhm et al., 2002] Bluhm, C., Overbeck, L., and Wagner, C. (2002). An
Introduction to Credit Risk Modeling. CRC Press/Chapman & Hall, London.
[Cherubini et al., 2004] Cherubini, U., Luciano, E., and Vecchiato, W. (2004).
Copula Methods in Finance. Wiley, Chichester.
[Crouhy et al., 2000] Crouhy, M., Galai, D., and Mark, R. (2000). A comparative
analysis of current credit risk models. J. Banking Finance, 24:59–117.
[Crouhy et al., 2001] Crouhy, M., Galai, D., and Mark, R. (2001). Risk
Management. McGraw-Hill, New York.
[Daul et al., 2003] Daul, S., De Giorgi, E., Lindskog, F., and McNeil, A. (2003).
The grouped t-copula with an application to credit risk. Risk, 16(11):73–76.
[Davis and Lo, 2001] Davis, M. and Lo, V. (2001). Infectious defaults. Quant.
Finance, 1(4):382–387.
[Duffie and Singleton, 1999] Duffie, D. and Singleton, K. (1999). Modeling term
structures of defaultable bonds. Rev. Finan. Stud., 12:687–720.
[Eberlein et al., 1998] Eberlein, E., Keller, U., and Prause, K. (1998). New insights
into smile, mispricing, and value at risk: the hyperbolic model. J. Bus., 38:371–405.
[Embrechts et al., 2002] Embrechts, P., McNeil, A., and Straumann, D. (2002).
Correlation and dependency in risk management: properties and pitfalls. In
Dempster, M., editor, Risk Management: Value at Risk and Beyond, pages 176–223.
Cambridge University Press, Cambridge.
[Fisher and Tippett, 1928] Fisher, R. and Tippett, L. (1928). Limiting forms of the
frequency distribution of the largest or smallest member of a sample. Proc. Camb.
Phil. Soc., 24:180–190.
[Föllmer and Schied, 2004] Föllmer, H. and Schied, A. (2004). Stochastic Finance:
An Introduction in Discrete Time. Walter de Gruyter, Berlin/New York, 2nd
edition.
[Frey and Backhaus, 2004] Frey, R. and Backhaus, J. (2004). Portfolio credit risk
models with interacting default intensities: a Markovian approach. Preprint,
University of Leipzig.
[Frey and McNeil, 2002] Frey, R. and McNeil, A. (2002). VaR and expected
shortfall in portfolios of dependent credit risks: conceptual and practical insights.
J. Banking Finance, pages 1317–1344.
[Frey and McNeil, 2003] Frey, R. and McNeil, A. (2003). Dependent defaults in
models of portfolio credit risk. J. Risk, 6(1):59–92.
[Glasserman and Li, 2003] Glasserman, P. and Li, J. (2003). Importance sampling
for portfolio credit risk. Preprint, Columbia Business School.
[Hawkes, 1971] Hawkes, A. (1971). Point spectra of some mutually exciting point
processes. J. R. Stat. Soc. Ser. B Stat. Methodol., 33:438–443.
[Hull and White, 2004] Hull, J. and White, A. (2004). Valuation of a CDO and an
nth to default CDS without Monte Carlo simulation. J. Derivatives, 12:8–23.
[Jarrow and Yu, 2001] Jarrow, R. and Yu, F. (2001). Counterparty risk and the
pricing of defaultable securities. J. Finance, 53:2225–2243.
[Lando, 1998] Lando, D. (1998). Cox processes and credit risky securities. Rev.
Derivatives Res., 2:99–120.
[Lando, 2004] Lando, D. (2004). Credit Risk Modeling: Theory and Applications.
Princeton University Press, Princeton.
[Laurent and Gregory, 2003] Laurent, J. and Gregory, J. (2003). Basket default
swaps, CDOs and factor copulas. Preprint, University of Lyon and BNP Paribas.
[McNeil et al., 2005] McNeil, A., Frey, R., and Embrechts, P. (2005). Quantitative
Risk Management: Concepts, Techniques and Tools. Princeton University Press,
Princeton.
[Merton, 1974] Merton, R. (1974). On the pricing of corporate debt: the risk
structure of interest rates. J. Finance, 29:449–470.
[Ogata, 1988] Ogata, Y. (1988). Statistical models for earthquake occurrences and
residuals analysis for point processes. J. Amer. Statist. Assoc., 83:9–27.
[Reiss and Thomas, 1997] Reiss, R.-D. and Thomas, M. (1997). Statistical
Analysis of Extreme Values. Birkhäuser, Basel.
[RiskMetrics Group, 1997] RiskMetrics Group (1997). CreditMetrics technical
document.
[Robert and Casella, 1999] Robert, C. and Casella, G. (1999). Monte Carlo
Statistical Methods. Springer, New York.
[Scholes, 2000] Scholes, M. (2000). Crisis and risk management. Amer. Econ.
Rev., pages 17–22.
[Steinherr, 1998] Steinherr, A. (1998). Derivatives: The Wild Beast of Finance.
Wiley, New York.