
QUANTITATIVE METHODS FOR FINANCIAL RISK

MANAGEMENT
By
Dr. BOUALEM BENDJILALI

ABSTRACT:

Risk management is one of the most important innovations of the 20th century.
Large derivatives losses and other financial incidents raised banks' consciousness
of risk. Banks became subject to regulatory capital requirements, internationally
coordinated by the Basle Committee of the Bank for International Settlements.
Financial institutions have set up research departments to develop new quantitative
tools to deal with the issue. Our presentation sheds light on some of the
quantitative tools used in the management of risk, and in particular in
financial risk management. The presentation has six sections. The first
section discusses financial risk and its importance, whereas the second section
presents the nature of the challenge. Section 3 discusses the types of financial
risks. Section 4 formulates the concept of the loss function. Section 5
quantitatively defines the most important risk measures, among them
Value at Risk (VaR) and Expected Shortfall (ES). Finally,
Section 6 briefly presents the importance of time series models in risk
management.

OUTLINE

I. Financial Risk in Perspective

II. Quantitative Risk Management : The Nature of the Challenge

III. Types of Financial Risks

IV. Loss Distributions

V. Risk Measures : Value at Risk (VaR) and Expected Shortfall (ES)

VI. Time series models


I. Financial Risk in Perspective

What is Risk / Risk Management?

Risk is hazard, a chance of bad consequences, loss, or exposure to mischance. It
can be looked at as any event or action that may adversely affect an organization's
or firm's ability to achieve its targets and execute its strategies. It can also be
viewed as the quantifiable likelihood of loss or of less-than-expected returns.
Risk management is a logical process or approach that seeks to minimize the level
of risk associated with a business operation. It is a process that aims to help
organizations, firms and institutions understand, evaluate and act on all
their risks, with a view to increasing the probability of their success and reducing
the likelihood of failure.
In other words, risk management seeks ways to manage the degree of uncertainty
that exists within any business enterprise. Once the business model
is understood, it is possible to identify the different risks that are present
throughout the process. As those risks are identified, they are analyzed for ways to
alter the process so that the end result is still achieved, but the degree of risk is
minimized or removed altogether.

Relationship between Risk and Randomness :

Risk relates to uncertainty and hence to the notion of randomness. Randomness
had eluded a clear, workable definition for centuries, until Kolmogorov offered an
axiomatic definition of randomness and probability in 1933.

Importance of Risk Management

Steinherr (1998) called risk management "one of the most important
innovations of the 20th century."
The late 20th century saw a revolution on financial markets. It was an era of
innovation in academic theory, product development (derivatives) and
information technology.
Large derivatives losses and other financial incidents raised banks' consciousness
of risk.
Banks became subject to regulatory capital requirements, internationally
coordinated by the Basle Committee of the Bank for International Settlements.
II. Quantitative Risk Management : The Nature of the
Challenge
The challenge is to put current practice onto a mathematical framework where
concepts like profit-and-loss distributions, risk factors, risk measures, capital
allocation and risk aggregation are given formal mathematical definitions. It is
therefore important to know the interrelation between the different types of risks as
well as the different quantitative and non-quantitative methods needed to treat the
subject.

Extremes Matter

At the Joint Central Bank Research Conference in 1995, Alan Greenspan said:
"From the point of view of the risk manager, inappropriate use of the normal
distribution can lead to an understatement of risk, which must be balanced against
the significant advantage of simplification. From the central bank's corner, the
consequences are even more serious because we often need to concentrate on the
left tail of the distribution in formulating lender-of-last-resort policies. Improving
the characterization of the distribution of extreme values is of paramount
importance."

John Meriwether wrote in the Wall Street Journal in August 2000: "With
globalization increasing, you'll see more crises. Our whole focus is on the
extremes now - what's the worst that can happen to you in any situation - because
we never want to go through that [LTCM] again." We need models that capture
the related phenomena of heavy tails, volatility and extreme values.
[John Meriwether, The Wall Street Journal, 21st August 2000]
This led scholars to develop mathematical models that deal with phenomena
with extreme values.

The Interdependence and Concentration of Risks

The multivariate nature of risk presents an important challenge. Whether we look
at market risk or credit risk, or overall enterprise-wide risk, we are generally
interested in some form of aggregate risk that depends on high-dimensional vectors
of underlying risk factors, such as individual asset values in market risk, or credit
spreads and counterparty default indicators in credit risk. A particular concern in
multivariate modeling is the phenomenon of dependence between extreme
outcomes, when many risk factors move against us simultaneously.
Dependent Extreme Values:

Business Week (1998) wrote: "Extreme, synchronized rises and
falls in financial markets occur infrequently but they do occur. The problem with
the models is that they did not assign a high enough chance of occurrence to the
scenario in which many things go wrong at the same time - the 'perfect storm'
scenario."

In a perfect storm scenario the risk manager discovers that the diversification he
thought he had is illusory; practitioners describe this also as a concentration of risk.

Concentration Risk
Scholes (2000), in his American Economic Review article, said:
"Over the last number of years, regulators have encouraged financial entities to use
portfolio theory to produce dynamic measures of risk. VaR, the product of
portfolio theory, is used for short-run, day-to-day profit-and-loss exposures. Now is
the time to encourage financial institutions and other regulatory bodies to support
studies on stress test and concentration methodologies. Planning for crises is more
important than VaR analysis. And such new methodologies are the correct
response to recent crises in the financial industry."

The Problem of Scale


A further challenge in QRM (Quantitative Risk Management) is the typical scale
of the portfolios, which at their most general may represent the entire position in
risky assets of a financial institution. Calibration of detailed multivariate models
for all risk factors is an almost impossible task, and hence any sensible strategy
involves dimension reduction, that is to say the identification of key risk drivers
and a concentration on modeling the main features of the overall risk landscape
with a fairly broad-brush approach. This applies both to market risk and credit risk
models.

Interdisciplinary
The quantitative risk manager of the future should have a combined skillset that
includes concepts, techniques and tools from many fields:
Mathematical finance;
Statistics and financial econometrics;
Actuarial mathematics;
Non-quantitative skills, especially communication skills;
III. Types of Financial Risks

Financial risks can be broadly classified into several categories, namely market
risk, credit risk, liquidity risk, operational risk, and legal risk.

Market risk is the risk of a change in the value of a financial
position due to changes in the value of the underlying components on which that
position depends, such as stock and bond prices, exchange rates, commodity
prices, etc. That is, it is the risk of loss arising from changes in the value of tradable
or traded assets.
Credit risk is the risk of loss due to the failure of the counterparty to pay the
promised obligation.
Liquidity risk is the risk of loss arising from the inability either to meet payment
obligations (funding liquidity risk) or to liquidate positions with little price impact
(asset liquidity risk).
Operational risk is the risk of loss caused by inadequate or failed internal
processes, people and systems, or external events.
Legal risk is the risk of loss arising from uncertainty about the enforceability of
contracts.

What is an Insurance Risk ?


The insurance industry also has a longstanding relationship with risk.
Actuaries are respected professionals whose innovative approach to making
business successful is matched by a responsibility to the public interest.
Actuaries identify solutions to financial problems. They manage assets and
liabilities by analyzing past events, assessing the present risk involved and
modeling what could happen in the future. An additional risk category
entering through insurance is underwriting risk: the risk inherent in
insurance policies sold.
IV. Formulation of the Loss Distribution:
To model risk we use the language of probability theory. Risks are represented by
random variables mapping unforeseen future states of the world into values
representing profits and losses. The risks that interest us are aggregate risks. In
general we consider a portfolio, which might be:
a collection of stocks and bonds;
a book of derivatives;
a collection of risky loans;
a financial institution's overall position in risky assets.

4.1 Portfolio Values and Losses

Consider a portfolio and let Vt denote its value at time t; we assume this random
variable is observable at time t. Suppose we look at risk from the perspective of
time t and consider the time period [t, t + 1]. The value Vt+1 of the portfolio at the
end of the period is unknown to us. The distribution of (Vt+1 − Vt) is known
as the profit-and-loss or P&L distribution. We denote the loss by

Lt+1 = −(Vt+1 − Vt)     (1)

By this convention, losses are positive numbers and profits negative. We refer
to the distribution of Lt+1 as the loss distribution.
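As a small numerical illustration of this sign convention (the portfolio values below are made up, not taken from the text), the P&L and loss series follow directly:

```python
import numpy as np

# Loss series under the convention L_{t+1} = -(V_{t+1} - V_t);
# the portfolio values are illustrative numbers.
V = np.array([100.0, 102.0, 101.0, 98.0, 99.5])

pnl = np.diff(V)   # P&L over each period [t, t+1]
loss = -pnl        # losses come out positive, profits negative
```

A drop in value from 101 to 98, for instance, appears as a loss of +3.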

Risk Factors and mapping

Generally the value of the portfolio at time t will depend on time and on a set of
observable risk factors. Let Zt = (Zt,1, Zt,2, . . . , Zt,n) denote the vector of risk
factors. Formally, we can write

Vt = f(t, Zt), where f : R+ x Rn → R     (2)

represents a map from R+ x Rn onto R. This representation is termed a mapping.
Examples of risk factors include logarithmic stock prices, index values, yields,
etc.
Loss Distribution
The loss distribution is the distribution of Lt+1. Let us denote the time series of
risk factor changes by the difference of the two vectors Xt+1 = Zt+1 − Zt.
Then the loss can be written as

Lt+1 = −(Vt+1 − Vt) = −[ f(t + 1, Zt + Xt+1) − f(t, Zt) ]     (3)

As of time t, the only random part is the risk factor change Xt+1. Hence the loss
distribution is determined by the mapping f and by the distribution of the risk
factor change. Sometimes we use a linearized version of (3):

LΔt+1 = −[ ft(t, Zt) Δt + Σi fzi(t, Zt) Xt+1,i ]     (4)

where the subscripts on f denote partial derivatives, the sum runs over
i = 1, . . . , n, and Δt is the risk management horizon.

Example: Portfolio of Stocks

Consider n stocks; let λt,i denote the number of shares in stock i at time t and let
St,i denote the price. Following standard convention, the risk factors may be taken
to be the logarithmic prices

Zt,i = ln St,i     (5)

The risk factor changes in this case are

Xt+1,i = ln St+1,i − ln St,i     (6)

which correspond to the so-called log-returns of the stock. The mapping is

Vt = Σi λt,i St,i = Σi λt,i exp(Zt,i)     (7)

The loss becomes

Lt+1 = −(Vt+1 − Vt) = −Σi λt,i St,i [ exp(Xt+1,i) − 1 ]     (8)

Here there is no explicit time dependence in the mapping (7). The partial
derivatives with respect to the risk factors are

fzi(t, Zt) = λt,i exp(Zt,i) = λt,i St,i  for i = 1, . . . , n

and hence the linearized loss (the right-hand side of equation (4)) is

LΔt+1 = −Σi λt,i St,i Xt+1,i = −Vt Σi wt,i Xt+1,i     (9)

where wt,i = λt,i St,i / Vt is the relative weight of stock i at time t. This formula may
be compared with (8).
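A quick sketch of this comparison in Python, with hypothetical holdings, prices and log-returns (illustrative numbers only), shows how close the linearized loss is to the exact one for small returns:

```python
import numpy as np

# Hypothetical two-stock portfolio: lam = shares held, S = prices at time t,
# X = log-returns over [t, t+1].
lam = np.array([10.0, 5.0])
S = np.array([50.0, 20.0])
X = np.array([0.01, -0.02])

V_t = float(np.sum(lam * S))     # portfolio value at time t
w = lam * S / V_t                # relative weights w_{t,i}

loss_exact = -float(np.sum(lam * S * (np.exp(X) - 1)))  # exact loss
loss_lin = -V_t * float(np.sum(w * X))                  # linearized loss
```

For these numbers the linearized loss is exactly −3.0 and the exact loss differs only in the third decimal.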
V. Risk Measures : Value at Risk (VaR) and
Expected Shortfall (ES)

While the original Basel Accord focused primarily on the (credit) risks associated
with the issuer, the 1996 amendment sought to give more coverage to market risk.
The amendment places certain regulatory requirements on the internal models
from which banks calculate their capital requirements.
One requirement is that the risk management group in charge of the
development and execution of these models should be independent of the
business units it monitors and should report directly to senior management.
Another requirement is that, besides calculating the regulatory capital
requirements, these models should be fully integrated into the bank's risk
measurement and management, and back-testing and stress-testing of their
performance should be carried out on a regular basis.

Discrete Returns
Let Pt denote the asset price at time t. Suppose the asset pays no
dividends over the period from time t − 1 to time t.
The one-period net return on this asset is Rt = (Pt − Pt−1)/Pt−1, and the one-
period gross return is Pt/Pt−1 = 1 + Rt.
The gross return over k periods is defined as

1 + Rt(k) = (Pt/Pt−1)(Pt−1/Pt−2) · · · (Pt−k+1/Pt−k) = Pt/Pt−k

and the net return over these periods is Rt(k). In practice, we usually use years
as the time unit. The annualized gross return for holding an asset over k years is
[1 + Rt(k)]^(1/k), and the annualized net return is [1 + Rt(k)]^(1/k) − 1.

Continuously compounded return (log return)

The logarithmic return, or continuously compounded return, on an asset is defined
as

rt = ln(Pt/Pt−1) = ln(1 + Rt)

One property of log returns is that, when the net return over the period is
small, the log return is approximately equal to the net return:

rt = ln(1 + Rt) ≈ Rt

A k-period log return is the sum of k simple single-period log returns (additivity
of multi-period returns):

rt(k) = ln[1 + Rt(k)] = rt + rt−1 + · · · + rt−k+1
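These identities can be checked numerically; the three prices below are illustrative:

```python
import math

# Check of the return identities with illustrative prices P_{t-2}, P_{t-1}, P_t.
P = [100.0, 105.0, 102.9]

R1 = P[1] / P[0] - 1            # one-period net return, = 0.05
R2 = P[2] / P[1] - 1            # = -0.02
gross_2 = (1 + R1) * (1 + R2)   # two-period gross return = P_t / P_{t-2}

r1 = math.log(1 + R1)           # one-period log returns
r2 = math.log(1 + R2)
# additivity: the two-period log return is the sum of the one-period ones
assert abs((r1 + r2) - math.log(gross_2)) < 1e-12
# for small returns, the log return is close to the net return
assert abs(r1 - R1) < 0.005
```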

Risk Measure : Value at Risk (VaR)

Risk measures attempt to quantify the riskiness of a portfolio. Most risk measures
are statistics of the loss distribution, such as the variance, Value at Risk (VaR) or
Expected Shortfall (ES). The most popular risk measures, such as VaR, describe
the right tail of the loss distribution.

A risk measure is a number ρ(X) associated with a random variable X that
represents the loss of a financial position over a holding period. In particular,
for market risk, if the log return during the holding period is r, then X = −r for
a long position and X = r for a short position.
Value at Risk is one of the most important and widely used risk management
statistics. It measures the maximum loss of a financial institution's position
due to market movements over a given holding period with a given level of
confidence.
Let F be the distribution of X and let 0 < p < 1. The 100p% value at risk
VaR(p) is defined as

VaR(p) = inf { x : F(x) ≥ p },

which is F^(−1)(p) if F is continuous and strictly increasing.
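A minimal empirical analogue of this definition replaces F by the sample distribution function of observed losses (the sample below is illustrative):

```python
import numpy as np

# Empirical VaR(p) = inf{ x : F_hat(x) >= p } from a sample of losses.
def var_empirical(losses, p):
    x = np.sort(np.asarray(losses, dtype=float))
    n = len(x)
    k = int(np.ceil(p * n))   # smallest k with F_hat(x_(k)) = k/n >= p
    return x[k - 1]

losses = [-2.0, -1.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 10.0]
print(var_empirical(losses, 0.9))   # 5.0
```

With ten observations, the 90% VaR is the ninth order statistic, here 5.0.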


Measures of market risk: Expected Shortfall (ES)

When the distribution function F of X is continuous, the expected shortfall (ES),
proposed by Artzner et al. (1997), is defined as the conditional expectation of the
loss given that the loss exceeds the 100p% VaR level or, more precisely,

ES(p) = E[ X | X > VaR(p) ]

if F is continuous. For general F, ES can be defined by the regularized version

ES(p) = (1/(1 − p)) ∫ from p to 1 of VaR(u) du,

which is coherent.
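Correspondingly, a sketch of an empirical tail-average estimator of ES (same illustrative sample as before; the cutoff convention used here is one of several in the literature):

```python
import numpy as np

# Empirical ES(p): average of the losses beyond the empirical VaR(p)
# order statistic.
def es_empirical(losses, p):
    x = np.sort(np.asarray(losses, dtype=float))
    k = int(np.ceil(p * len(x)))   # index of the VaR(p) order statistic
    return x[k:].mean()            # mean of the losses beyond VaR(p)

losses = [-2.0, -1.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 10.0]
print(es_empirical(losses, 0.9))   # 10.0
```

Note that the estimate (10.0) exceeds the corresponding empirical 90% VaR (5.0), as it must.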

Advantages of ES

ES takes the whole tail of the distribution beyond VaR into account; in
particular, ES ≥ VaR.
ES has better properties regarding the aggregation of risk. This is related to
the so-called coherence of risk measures.

Fig 1: VaR in Visual Terms


Coherent risk measures

Although VaR has become a standard risk measure for financial risk management,
it has been criticized for disregarding losses beyond the VaR level. To remedy this
difficulty with VaR, Artzner et al. (1997, 1999) propose axioms that risk
measures need to satisfy in order to be coherent. Let XA and XB denote the
losses of financial positions A and B, respectively, over a time horizon.
The risk measure ρ(·) is said to be coherent if it satisfies the following
properties:
1. Monotonicity: if XA ≤ XB, then ρ(XA) ≤ ρ(XB).
2. Sub-additivity: ρ(XA + XB) ≤ ρ(XA) + ρ(XB).
This is the most debated property. It is necessary for the following reasons:
It reflects the idea that risk can be reduced by diversification and that a merger
creates no extra risk.
It makes decentralized risk management possible.
If a regulator uses a non-sub-additive risk measure, a financial institution could
reduce its risk capital by splitting into subsidiaries.

3. Positive homogeneity: ρ(bX) = b ρ(X) for b > 0.

4. Translational invariance: ρ(X + a) = ρ(X) + a for any real number a.
5. Convexity: ρ(λXA + (1 − λ)XB) ≤ λρ(XA) + (1 − λ)ρ(XB) for all λ in [0, 1].
A risk measure that satisfies monotonicity, translational invariance and
convexity is called a convex measure of risk.

Remarks:
VaR is in general not coherent; ES (as we have defined it) is coherent.
Non-sub-additivity of VaR is relevant in the presence of skewed loss distributions
or if traders optimize against VaR.
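The non-sub-additivity of VaR for skewed losses can be seen in a classic two-loan sketch (the default probability of 0.04 is an illustrative number, not from the text):

```python
# Two independent loans, each losing 1 with default probability 0.04, else 0.
p_default = 0.04
p = 0.95

# Single loan: F(0) = 0.96 >= 0.95, so VaR_0.95 = 0 for each loan alone.
var_single = 0 if 1 - p_default >= p else 1

# Two loans: P(S = 0) = 0.96^2 = 0.9216 < 0.95, but
# P(S <= 1) = 0.9216 + 2*0.04*0.96 = 0.9984 >= 0.95, so VaR_0.95 = 1.
p0 = (1 - p_default) ** 2
p1 = 2 * p_default * (1 - p_default)
var_combined = 0 if p0 >= p else (1 if p0 + p1 >= p else 2)

# VaR(A + B) = 1 > 0 = VaR(A) + VaR(B): VaR fails sub-additivity here.
assert var_combined > var_single + var_single
```

Diversifying across the two loans appears, by the VaR number alone, to have increased risk.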

The Gaussian convention and the t-modification

The classical framework of i.i.d. normal returns provides a convenient setting
for VaR and ES calculations. For a portfolio consisting of p assets with daily
returns r1, . . . , rp and corresponding weights w1, . . . , wp, the return of the portfolio
is r = Σi wi ri, and the mean and variance of the portfolio return are given by

μ = Σi wi μi,  σ² = Σi Σj wi wj σij

where the μi are the mean returns and the σij
are the variances and covariances of these assets.

If all asset returns are jointly normally distributed, so that r ~ N(μ, σ²),
then the 100(1 − α)% VaR of a long position over k days is

VaR = √k σ z1−α − kμ

and the 100(1 − α)% ES of a long position over k days is

ES = √k σ φ(z1−α)/α − kμ

where z1−α is the (1 − α)th quantile of the standard normal distribution and φ(·) is
the density function of the standard normal distribution, for which zα = −z1−α.
Similarly, for a short position,

VaR = √k σ z1−α + kμ  and  ES = √k σ φ(z1−α)/α + kμ

In practice, both μ and σ are unknown and have to be estimated from past data.
For VaR calculations, the normal distribution is often replaced by a Student t-
distribution, which has fatter tails.
Applications of Principal Component Analysis

The covariances σij in the portfolio variance formula above are the essential
ingredients of portfolio risk. Since they are unknown in practice, the task of
estimating them from limited historical data becomes increasingly difficult as
the number of assets grows.
A widely used method to overcome this curse of dimensionality in VaR
models is to perform principal component analysis (PCA) to determine
whether σ² can be approximated by the sum of the variances of a
relatively small number of principal components.
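A minimal PCA sketch via eigendecomposition of a covariance matrix (the matrix below is a small illustrative one, not the Treasury-rate data):

```python
import numpy as np

# PCA of a covariance matrix by eigendecomposition: eigenvectors give the
# factor loadings, eigenvalues the variances of the factor scores.
Sigma = np.array([[4.0, 2.0, 0.6],
                  [2.0, 3.0, 0.4],
                  [0.6, 0.4, 1.0]])

eigvals, eigvecs = np.linalg.eigh(Sigma)   # eigh returns ascending eigenvalues
order = np.argsort(eigvals)[::-1]          # reorder: largest variance first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()        # proportion of total variance per PC
loadings = eigvecs                         # column j = loadings of PC j
```

If the first few entries of `explained` are close to 1, the portfolio variance is well approximated by those few components.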

Example
Consider the seven weekly U.S. Treasury rates with maturities 1, 2, 3, 5, 7, 10, and
20 years from October 1, 1993 to March 23, 2007. Table 1 gives the factor
loadings and the standard deviations of the factor scores obtained using the
covariance matrix of interest-rate changes. Note that the first three principal
components account for about 99% of the variance in the data.

Table 1: PCA of seven U.S. Treasury rates (standard deviations, proportions of
variance, and factor loadings).

Table 2: Change in portfolio value for a 1-basis-point rate change.

Maturity (years):    1    2    3    5    7    10    20
Change ($10^6):      8    4    8    3    2     1     2

Consider a portfolio with the exposures to interest-rate moves shown in Table 2, in
which a 1-basis-point change in the 1-year (or 2-year, . . . , or 20-year) rate causes
the portfolio value to increase by $8 (or 4, . . . , or 2) million; 1 basis point = 0.01%.
From the results in Table 1, the exposure (measured in millions of
dollars) to the first principal component f1 works out to 0.259.
The exposure to the second principal component f2 is

8(0.590) + 4(0.398) − 8(0.239) + 3(0.044) − 2(0.240) + 0.363 − 2(0.495) = 3.427.

The change in the portfolio value can therefore be represented to a good
approximation by ΔP = 0.259 f1 + 3.427 f2, and the standard deviation of ΔP is
therefore √[(0.259 σ1)² + (3.427 σ2)²] = 24.61, where σ1 and σ2 are the standard
deviations of f1 and f2, respectively. Hence, the 99% 1-week VaR is 24.61 × 2.33 =
57.34 (million dollars), assuming a normal distribution for ΔP.
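The same computation can be organized generically; note that the loadings and factor-score standard deviations below are hypothetical placeholders, not the actual Table 1 values:

```python
import numpy as np

# Factor-based VaR: d holds the dollar exposures per basis point (Table 2);
# l1, l2 and sd are HYPOTHETICAL loadings and factor-score std devs.
d = np.array([8.0, 4.0, 8.0, 3.0, 2.0, 1.0, 2.0])             # $10^6 per bp
l1 = np.array([0.32, 0.35, 0.40, 0.42, 0.41, 0.38, 0.34])     # hypothetical PC1
l2 = np.array([0.59, 0.40, -0.24, 0.04, -0.24, 0.36, -0.50])  # hypothetical PC2
sd = np.array([17.5, 6.0])               # hypothetical std devs of f1, f2

expo = np.array([d @ l1, d @ l2])        # exposures to the two factors
sigma_dP = float(np.sqrt(np.sum((expo * sd) ** 2)))
var_99 = 2.33 * sigma_dP                 # 99% VaR, with z_0.99 ~ 2.33
```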
VI. Time series models

So far we have assumed that the returns are i.i.d. However, since risk may vary
with time and since the available data are time series of past returns, better VaR
models can be built by incorporating the time-series properties of the returns data.
In particular, linear time series models can be used to model the mean returns μt
over time, and conditional heteroskedastic models can be used to model the
time-varying volatilities σt, as in the following ARMA(p, q)-GARCH(h, k) model
for (rt, σt):

rt = μt + at,  μt = μ + φ1 rt−1 + · · · + φp rt−p + ψ1 at−1 + · · · + ψq at−q,

at = σt εt,  σt² = ω + α1 a²t−1 + · · · + αh a²t−h + β1 σ²t−1 + · · · + βk σ²t−k,

where the εt are i.i.d. standard normal or standardized Student-t random variables.
The model can be estimated by maximum likelihood and implemented by the
garchfit function in the GARCH toolbox of MATLAB. The conditional
distribution of rt+1 given the information available at time t is N(μt+1|t, σ²t+1|t),
where μt+1|t and σ²t+1|t are given as

μt+1|t = μ + φ1 rt + · · · + φp rt+1−p + ψ1 at + · · · + ψq at+1−q,

σ²t+1|t = ω + α1 a²t + · · · + αh a²t+1−h + β1 σ²t + · · · + βk σ²t+1−k.

Hence the 100(1 − α)% 1-day VaR and ES of a long position are

σt+1|t z1−α − μt+1|t  and  σt+1|t φ(z1−α)/α − μt+1|t,

respectively, assuming normal εt.
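A sketch of the one-step volatility forecast and the resulting VaR/ES under a plain GARCH(1,1) with normal innovations (the parameters and return series are illustrative; in practice they are fitted by maximum likelihood, e.g. with MATLAB's garchfit as mentioned above):

```python
import numpy as np
from statistics import NormalDist

# GARCH(1,1) recursion sigma2_{t+1} = omega + alpha1*a_t^2 + beta1*sigma2_t,
# rolled forward over a toy return series to get the one-step forecast.
omega, alpha1, beta1, mu = 1e-6, 0.08, 0.90, 0.0

rng = np.random.default_rng(0)
r = 0.01 * rng.standard_normal(500)   # toy daily returns

sigma2 = float(np.var(r))             # initialize the conditional variance
for a in r - mu:                      # run the variance recursion
    sigma2 = omega + alpha1 * a ** 2 + beta1 * sigma2

alpha = 0.01
nd = NormalDist()
z = nd.inv_cdf(1 - alpha)                        # z_{1-alpha}
var_1d = sigma2 ** 0.5 * z - mu                  # 99% 1-day VaR, long position
es_1d = sigma2 ** 0.5 * nd.pdf(z) / alpha - mu   # 99% 1-day ES, long position
```

The same recursion, with forecasts iterated further ahead, underlies multi-day GARCH-based VaR.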


REFERENCES

[Abramowitz and Stegun, 1965] Abramowitz, M. and Stegun, I., editors (1965).
Handbook of Mathematical Functions. Dover Publications, New York.

[Acerbi and Tasche, 2002] Acerbi, C. and Tasche, D. (2002). On the coherence of
expected shortfall. J. Banking Finance, 26:1487-1503.

[Artzner et al., 1999] Artzner, P., Delbaen, F., Eber, J., and Heath, D. (1999).
Coherent measures of risk. Math. Finance, 9:203-228.

[Balkema and de Haan, 1974] Balkema, A. and de Haan, L. (1974). Residual life
time at great age. Ann. Probab., 2:792-804.

[Barndorff-Nielsen, 1997] Barndorff-Nielsen, O. (1997). Normal inverse Gaussian
distributions and stochastic volatility modelling. Scand. J. Statist., 24:1-13.

[Barndorff-Nielsen and Shephard, 1998] Barndorff-Nielsen, O. and Shephard, N.
(1998). Aggregation and model construction for volatility models. Preprint, Center
for Analytical Finance, University of Aarhus.

[Black and Scholes, 1973] Black, F. and Scholes, M. (1973). The pricing of
options and corporate liabilities. J. Polit. Economy, 81(3):637-654.

[Bluhm, 2003] Bluhm, C. (2003). CDO modeling: techniques, examples and
applications. Preprint, HVB Group, Munich.

[Bluhm et al., 2002] Bluhm, C., Overbeck, L., and Wagner, C. (2002). An
Introduction to Credit Risk Modeling. CRC Press/Chapman & Hall, London.

[Cherubini et al., 2004] Cherubini, U., Luciano, E., and Vecchiato, W. (2004).
Copula Methods in Finance. Wiley, Chichester.

[Clayton, 1996] Clayton, D. (1996). Generalized linear mixed models. In Gilks,
W., Richardson, S., and Spiegelhalter, D., editors, Markov Chain Monte Carlo in
Practice, pages 275-301. Chapman & Hall, London.

[Crosbie and Bohn, 2002] Crosbie, P. and Bohn, J. (2002). Modeling default risk.
Technical document, Moody's/KMV, New York.

[Crouhy et al., 2000] Crouhy, M., Galai, D., and Mark, R. (2000). A comparative
analysis of current credit risk models. J. Banking Finance, 24:59-117.

[Crouhy et al., 2001] Crouhy, M., Galai, D., and Mark, R. (2001). Risk
Management. McGraw-Hill, New York.

[Daley and Vere-Jones, 2003] Daley, D. and Vere-Jones, D. (2003). An
Introduction to the Theory of Point Processes, volume I: Elementary Theory and
Methods. Springer, New York, 2nd edition.

[Daul et al., 2003] Daul, S., De Giorgi, E., Lindskog, F., and McNeil, A. (2003).
The grouped t-copula with an application to credit risk. Risk, 16(11):73-76.

[Davis and Lo, 2001] Davis, M. and Lo, V. (2001). Infectious defaults. Quant.
Finance, 1(4):382-387.

[Duffie and Singleton, 1999] Duffie, D. and Singleton, K. (1999). Modeling term
structures of defaultable bonds. Rev. Finan. Stud., 12:687-720.

[Eberlein and Keller, 1995] Eberlein, E. and Keller, U. (1995). Hyperbolic
distributions in finance. Bernoulli, 1:281-299.

[Eberlein et al., 1998] Eberlein, E., Keller, U., and Prause, K. (1998). New insights
into smile, mispricing, and value at risk: the hyperbolic model. J. Bus., 38:371-405.

[Embrechts et al., 2002] Embrechts, P., McNeil, A., and Straumann, D. (2002).
Correlation and dependency in risk management: properties and pitfalls. In
Dempster, M., editor, Risk Management: Value at Risk and Beyond, pages
176-223. Cambridge University Press, Cambridge.

[Fisher and Tippett, 1928] Fisher, R. and Tippett, L. (1928). Limiting forms of the
frequency distribution of the largest or smallest member of a sample. Proc. Camb.
Phil. Soc., 24:180-190.

[Follmer and Schied, 2004] Follmer, H. and Schied, A. (2004). Stochastic Finance:
An Introduction in Discrete Time. Walter de Gruyter, Berlin, 2nd edition.

[Frey and Backhaus, 2004] Frey, R. and Backhaus, J. (2004). Portfolio credit risk
models with interacting default intensities: a Markovian approach. Preprint,
University of Leipzig.

[Frey and McNeil, 2002] Frey, R. and McNeil, A. (2002). VaR and expected
shortfall in portfolios of dependent credit risks: conceptual and practical insights.
J. Banking Finance, pages 1317-1344.

[Frey and McNeil, 2003] Frey, R. and McNeil, A. (2003). Dependent defaults in
models of portfolio credit risk. J. Risk, 6(1):59-92.

[Glasserman and Li, 2003] Glasserman, P. and Li, J. (2003). Importance sampling
for portfolio credit risk. Preprint, Columbia Business School.

[Gnedenko, 1943] Gnedenko, B. (1943). Sur la distribution limite du terme
maximum d'une serie aleatoire. Ann. of Math., 44:423-453.

[Gordy, 2003] Gordy, M. (2003). A risk-factor model foundation for ratings-based
capital rules. J. Finan. Intermediation, 12(3):199-232.

[Greenspan, 2002] Greenspan, A. (2002). Speech before the Council on Foreign
Relations. In International Financial Risk Management, Washington, D.C., 19th
November.

[Hawkes, 1971] Hawkes, A. (1971). Point spectra of some mutually exciting point
processes. J. R. Stat. Soc. Ser. B Stat. Methodol., 33:438-443.

[Hull and White, 2004] Hull, J. and White, A. (2004). Valuation of a CDO and an
nth to default CDS without Monte Carlo simulation. J. Derivatives, 12:8-23.

[Jarrow and Yu, 2001] Jarrow, R. and Yu, F. (2001). Counterparty risk and the
pricing of defaultable securities. J. Finance, 53:2225-2243.

[Joe, 1997] Joe, H. (1997). Multivariate Models and Dependence Concepts.
Chapman & Hall, London.

[Kealhofer and Bohn, 2001] Kealhofer, S. and Bohn, J. (2001). Portfolio
management of default risk. Technical document, Moody's/KMV, New York.

[Lando, 1998] Lando, D. (1998). Cox processes and credit risky securities. Rev.
Derivatives Res., 2:99-120.

[Lando, 2004] Lando, D. (2004). Credit Risk Modeling: Theory and Applications.
Princeton University Press, Princeton.

[Laurent and Gregory, 2003] Laurent, J. and Gregory, J. (2003). Basket default
swaps, CDOs and factor copulas. Preprint, University of Lyon and BNP Paribas.

[Marshall and Olkin, 1988] Marshall, A. and Olkin, I. (1988). Families of
multivariate distributions. J. Amer. Statist. Assoc., 83:834-841.

[McNeil, 1998] McNeil, A. (1998). History repeating. Risk, 11(1):99.

[McNeil et al., 2005] McNeil, A., Frey, R., and Embrechts, P. (2005). Quantitative
Risk Management: Concepts, Techniques and Tools. Princeton University Press,
Princeton.
[Merton, 1974] Merton, R. (1974). On the pricing of corporate debt: the risk
structure of interest rates. J. Finance, 29:449-470.

[Ogata, 1988] Ogata, Y. (1988). Statistical models for earthquake occurrences and
residual analysis for point processes. J. Amer. Statist. Assoc., 83:9-27.

[Pickands, 1975] Pickands, J. (1975). Statistical inference using extreme order
statistics. Ann. Statist., 3:119-131.

[Reiss and Thomas, 1997] Reiss, R.-D. and Thomas, M. (1997). Statistical
Analysis of Extreme Values. Birkhauser, Basel.

[RiskMetrics Group, 1997] RiskMetrics Group (1997). CreditMetrics technical
document.

[Robert and Casella, 1999] Robert, C. and Casella, G. (1999). Monte Carlo
Statistical Methods. Springer, New York.

[Scholes, 2000] Scholes, M. (2000). Crisis and risk management. Amer. Econ.
Rev., pages 17-22.

[Steinherr, 1998] Steinherr, A. (1998). Derivatives: The Wild Beast of Finance.
Wiley, New York.

[Schonbucher, 2003] Schonbucher, P. (2003). Credit Derivatives Pricing Models.
Wiley.

[Schonbucher and Schubert, 2001] Schonbucher, P. and Schubert, D. (2001).
Copula-dependent default risk in intensity models. Preprint, Universitat Bonn.

[Smith, 1987] Smith, R. (1987). Estimating tails of probability distributions. Ann.
Statist., 15:1174-1207.

[Smith, 1989] Smith, R. (1989). Extreme value analysis of environmental time
series: an application to trend detection in ground-level ozone. Statist. Sci.,
4:367-393.

[Tavakoli, 2001] Tavakoli, J. (2001). Credit Derivatives and Synthetic Structures:
A Guide to Investments and Applications. Wiley, New York, 2nd edition.
