
RiskMetrics™ Monitor

J.P. Morgan/Reuters

First quarter 1997


New York
March 14, 1997

RiskMetrics News
INNOVA Financial Solutions offers RMOnline, a free, user-friendly internet application
that uses the RiskMetrics datasets.
Oracle Financial Services Consulting has been added to RiskMetrics third party listing.

Morgan Guaranty Trust Company


Risk Management Research
Peter Zangari
(1-212) 648-8641
zangari_peter@jpmorgan.com

Reuters Ltd
International Marketing
Martin Spencer
(44-171) 542-3260
martin.spencer@reuters.com

Research, Development, and Applications


On measuring credit exposure

RiskMetrics provides a framework to measure market risk, that is, the risk associated with
changes in market rates. However, the risk in a transaction depends not only on changes in
market rates but also on the credit standing of the counterparty to that transaction. A fundamental step towards measuring the risk in a transaction that is subject to default is the computation of credit exposure. The purpose of this article is to present three methodologies for
measuring the credit exposure of transactions whose mark-to-market value is a function of
current market rates. Such transactions include bonds, swaps and FX forwards. The first two
methodologies that we present provide credit exposure measures without relying on simulation and may be computed using the RiskMetrics methodology and data. The third
approach estimates credit exposure by simulating future rates.
The effect of EMU on risk management    23

On January 1, 1999, if the currently agreed calendar is respected, a number of European
currencies will disappear into history and be replaced by a common monetary unit called the
Euro. The purpose of this article is to review how this will affect frameworks for market risk
management and how specific products such as RiskMetrics will be impacted by the change.
Streamlining the market risk measurement process    29

In this note we describe a simple and effective approach for calculating Value-at-Risk (VaR)
that reduces some of the computational burdens confronting today's risk managers. We propose
a general methodology to measure VaR that is based on what we refer to as the portfolio
aggregation principle.
Previous editions of the RiskMetrics Monitor    39


RiskMetrics News
Scott Howard
Morgan Guaranty Trust Company
Risk Management Advisory
(1-212) 648-4317
howard_james_s@jpmorgan.com

INNOVA Financial Solutions

http://www.ifs.dk/RMOnline/RMOnline.html

RMOnline is a free-to-use and very user-friendly internet application that uses the full RiskMetrics
daily datasets. RMOnline can be used with any web browser that supports "tables" and "forms". RMOnline
requires a log-on (you can be anonymous) because it stores the last portfolio you entered for reuse
the next time you log on.
Features of RMOnline:
- Works with and uses the RiskMetrics daily datasets provided by J.P. Morgan and Reuters.
- It is based on the RiskMetrics methodology and measures the market risk of a given portfolio.
- It supports interest and foreign exchange rates and commodity and equity prices for 31 countries plus the XEU.
- Supports non-USD based portfolios.
- You work with your portfolio on the server.

Oracle Financial Services Consulting, Risk Management Practice


520 Madison Avenue, 29th floor, New York, NY 10022
Margaret Paterson, (212) 508-7985, FAX (212) 508-7958

Oracle Consulting is organised into geographic, functional, and industry groups. Geographically we
operate in over 70 countries globally. Functional groups provide our clients with experience and
knowledge in specific areas such as financial applications, business process re-engineering, performance
optimization, open systems transformations, IS strategy and data warehouse implementations.
There are over 7,000 professional consulting personnel whose key responsibilities are helping clients
leverage our technology to implement business solutions provided by Oracle and our business partners.
The risk management practice within Financial Services Consulting offers project leadership
and business expertise in all aspects of risk including market, liquidity, credit, legal, and operational.
This team has long-standing expertise in all aspects of risk management systems development, with an
average of over 10 years in the business. Assistance in the implementation and customization of the
Infinity product for any business requirement can easily be handled by this group.


On measuring credit exposure


Peter Zangari
Morgan Guaranty Trust Company
Risk Management Research
(1-212) 648-8641
zangari_peter@jpmorgan.com

RiskMetrics provides a framework to measure market risk, that is, the risk associated with changes
in market rates. However, the risk in a particular transaction depends not only on changes in market
rates but also on the credit standing of the counterparty to that transaction. For example, when two
parties enter into an interest rate swap, the risk of that swap to a particular party depends on two factors:¹
(1) the potential changes in swap rates and (2) whether or not the counterparty will default prior to the
swap's maturity. A fundamental step towards measuring the risk in a transaction that is subject to default
is the computation of credit exposure. The credit exposure in a particular transaction is the nominal
amount that can be lost when a counterparty defaults on its obligations. Note that credit exposure
is not a risk measure but rather an amount that, when combined with other information (e.g., the
likelihood of default), can provide a measure of credit risk.

The purpose of this article is to present three methodologies for measuring the credit exposure of
transactions whose values have been marked-to-market. The first two methodologies that we present provide
credit exposure measures without relying on simulation and may be computed using the RiskMetrics
methodology and data. The third approach estimates credit exposure by simulating future rates.

In order to facilitate the exposition of measuring credit exposure, this article focuses exclusively on the
credit exposure of plain vanilla interest rate (IR) swaps. However, the reader should understand that the
general principles explained below apply to any instrument whose cashflows can be identified and
marked-to-market. The rest of the article is organized as follows:
In section 1, we describe the relationship between an IR swap's market value and credit exposure.
We identify two types of exposure: current and potential. Whereas current exposure is
simply a function of the mark-to-market value of a swap, potential exposure depends on the
values of future swap rates as well as the mark-to-market value. Measures of potential exposure
can be classified into worst case and expected measures.

Section 2 provides the theory and computational details of two analytic (non-simulation
based) approaches for measuring potential credit exposure.

- Section 2.1 presents a statistical approach to measuring potential exposure. This method
relies on RiskMetrics methodology and data (volatilities and correlations) and applies a
normal probability model of transaction value to measure exposure. Sections 2.1.1 and
2.1.2 show how to compute worst case (maximum and peak) and expected (expected and
average) exposures, respectively, and section 2.1.3 explains some practical issues involving
the calculation of these exposures. Finally, section 2.1.4 shows how to compute the potential
exposure of a portfolio of swaps.

- Section 2.2 reviews the calculation of potential exposure that is based on standard option
pricing theory.

Section 3 presents a full simulation methodology for measuring credit exposures. We use this
model to estimate the credit exposure of IR swaps and compare these results to those provided
by the analytic approaches.

Section 4 offers concluding remarks.

1. It should be noted that time also affects the exposure calculation.


1. The relationship between credit exposure and market value


Traditionally, there are two types of credit exposure. Current exposure is the exposure based on a
transaction's mark-to-market value. If a transaction has a positive market value to a given party, then its
current exposure will be equivalent to its market value since, if a counterparty defaults, the mark-to-market
value of the transaction is assumed lost. On the other hand, the current exposure of a transaction that
has a negative (or zero) mark-to-market value is zero. This follows from the fact that if a party owes
money at the time its counterparty defaults, its loss is zero. For IR swaps, current exposure is the cost
of replacing a swap at current market rates. This cost is often referred to as the replacement cost.²

Potential exposure is the credit exposure that may arise in the future when interest rates change. Consequently,
and unlike current exposure, the best a risk manager can do is estimate potential exposure
given some model of how rates and prices evolve over time. Popular measures of potential exposure
include maximum, peak, expected and average exposure.

1.1 Current exposure


Suppose a Bank and a Company enter into a simple IR swap arrangement where the Bank is receiving
payments according to some fixed rate (fixed-rate receiver) and is required to make payments according
to, say, the 6-month LIBOR rate. If we are currently at time t (t is known as the analysis date), $V_t$ denotes
the swap's mark-to-market value (e.g., the present value of the swap given current market rates).
From the Bank's perspective, $V_t$ is defined as follows

[1]  $V_t = V_t(\text{fixed side}) - V_t(\text{floating side})$

where $V_t(\text{fixed side})$ and $V_t(\text{floating side})$ are the mark-to-market values of the fixed and floating
sides of the swap at time t, respectively. Now, at time t the Bank faces the possibility of credit loss only
if it is owed money by its counterpart, the Company. In other words, the Bank has current credit exposure
only if $V_t$ is greater than zero, which would require that the mark-to-market value of the receipts
based on the fixed rate is greater than the mark-to-market payments based on the floating rate, i.e.,
$V_t(\text{fixed side}) > V_t(\text{floating side})$. When such a scenario exists, the swap is known to be in-the-money
to the Bank and the current exposure is given by the difference between $V_t(\text{fixed side})$ and
$V_t(\text{floating side})$. Alternatively, if the swap is at- or out-of-the-money at time t, i.e.,
$V_t(\text{fixed side}) \le V_t(\text{floating side})$, then the Bank's current exposure is zero. This follows from the
fact that the Bank would be a net payer if the Company were to default at time t.

We can generalize the relationship between current exposure and market value as follows. Let $E_t$ denote
the current exposure of a particular transaction at time t. Current credit exposure is defined in terms
of the mark-to-market value of a transaction by the following relationship

[2]  $E_t = V_t$ if $V_t > 0$;  $E_t = 0$ if $V_t \le 0$

Eq. [2] can also be written as

[3]  $E_t = \max(V_t, 0)$

where max(a, b) returns the maximum of a and b.

2. A similar definition is given by Smithson, Smith and Wilford (1995, p. 436), who write, "the current replacement cost
indicates the cost of replacing a counterparty if the counterparty defaults today."


1.2 Potential exposure


Potential exposure is the result of future changes in underlying prices and rates that affect the value of
a particular transaction. That is to say, potential exposure calculations recognize the probability distribution of underlying financial prices. In addition, and as will be shown in more detail below, potential
exposure is a function of the time that exposure is measured.
For example, suppose it is 6 months after the Company has agreed to pay the Bank a fixed rate of 6.40%
for 6-month LIBOR. Assume that the swap's maturity is 3 years. If swap rates have fallen since the
trade:

the value of the swap to the Company (fixed-rate payer) would be below its original purchase
price since the Company would be paying a fixed rate that is above current market rates.

For the Bank, the value of the swap would have increased since it would be receiving an above-market rate.

The situation is reversed for rate increases.
Table 1 depicts the relationship between the potential exposure of fixed (floating) rate payers
(receivers) in an IR swap and future changes in interest rates.

Table 1
Relating IR swap value & potential exposure to future changes in interest rates
Entries signify increases and decreases in potential exposure and value

                                     Pay fixed (Receive floating)      Receive fixed (Pay floating)
Future increase in interest rates    Swap value & exposure increase    Swap value & exposure decrease
Future decrease in interest rates    Swap value & exposure decrease    Swap value & exposure increase
Continuing with the hypothetical swap arrangement between the Bank and Company, and referring to
Table 1, let's analyze the potential exposure from the Company's perspective. Since the Company is
currently the fixed-rate payer, if interest rates were to increase in the future, then

the swap value increases to the Company as it will be paying a below-market rate;

the Company's exposure increases since, if the Bank defaults, it may be forced into entering a
new contract where it will pay a higher fixed rate. That is, the replacement cost to the
Company under default has increased.³

There are two important points to be taken from Table 1. First, future interest rate scenarios affect
exposure and market value in the same way. Second, a change in interest rates affects the two parties of
a swap in an offsetting, though not equal, manner.

Risk managers often focus on two measures of potential exposure: worst case and expected. Worst
case measures provide estimates of exposure in terms of future values. Measures of this type include
maximum and peak exposure. Expected measures estimate credit exposure in terms of present and future
value. The exposure that exists at any point in time in the future is referred to as expected exposure.

3. There is a third result: there is a higher likelihood of default by the Bank, who is paying floating.


In practice, we can compute expected exposure at different points in the future over the life of a transaction. The weighted⁴ present value of these exposures is known as average exposure.

1.2.1 Worst case measures of credit exposure (maximum and peak exposure)
We define maximum exposure at a particular point in time as the 95th percentile of the distribution of
values of outstanding transactions at that time. In the case of swaps, should a counterparty default, there
is only a 5% chance of having to pay more than this amount to replace the outstanding swap.
Maximum exposure is an important measure of credit exposure because it can be used to determine how
much credit to allocate for transactions against a general counterparty (credit allocation function). Risk
managers may also use maximum exposure for credit risk control. For example, risk managers may
want to identify those transactions whose current exposure is greater than the maximum exposure that
was defined when the transaction originated.
A by-product of maximum exposure is peak exposure. Peak exposure is the maximum of all maximum
exposures over a specified time interval. Peak exposure is a useful measure of credit exposure because
it tells risk managers the time in the future when the largest losses are expected, given that a counterparty defaults.

1.2.2 Expected measures of credit exposure (expected and average exposure)
Expected exposure measures the amount, on average, that will be lost if a default occurs. We compute
expected exposures at several different points in the future over the life of a transaction. These points
are known as sampling times. Below, we will use the letter i to denote sampling times where i =0,...,N
and there are a total of N+1 sampling times. Note that i = 0 corresponds to the current time. Hence, if
the current time is t, and there are 6 sampling times, we know that exposure will be calculated at times
t, t+1, t+2, t+3, t+4, t+5 where the exposure measured at time t is simply the current exposure.
Given a series of expected exposures, average exposure is the average of all expected exposures. Since
averaging is performed over time, care must be taken to weight each expected exposure by the appropriate discount factor.

2. Potential credit exposure calculations using analytic methods


Now that we have an overview of the issues, we present the mathematical details behind the credit exposure calculation and do so in the context of a specific example. Suppose that the details of the swap
arrangement between the Bank and Company are as presented in Table 2.

4. The weights correspond to different discount factors to account for averaging over time.


Table 2
Swap description

Trade date:            January 24, 1997
Maturity date:         January 24, 2000
Notional principal:    US $10 million
Fixed-rate payer:      Company
Fixed rate:            6.40%
Fixed-rate receiver:   Bank
Floating rate:         6-month LIBOR (money market basis)
Reset dates:           July 24 and January 24 of each year
LIBOR determination:   determined in advance, paid in arrears

Chart 1 depicts the swap arrangement between the Bank and Company.

Chart 1
Swap cash flows
[Diagram: the Company pays the Bank a fixed rate of 6.40% and receives 6-month LIBOR from the Bank.]

The swap description tells us that the Bank and Company enter into a three year par swap with a notional
value of USD 10 million beginning January 24, 1997. In the following discussion we treat January 24, 1997
as the current time and denote it by t. The Company will pay the Bank a fixed annualized rate of 6.40% on a
semi-annual basis and will receive from the Bank payments that are based on the 6-month LIBOR rate.⁵
Table 3 presents the cashflows generated by the swap from the Company's perspective. Note that the
cashflows based on the floating rate were generated using the forward 6-month LIBOR curve.
Table 3
3 year 10mm USD interest rate par swap
Fixed rate = 6.40%; semi-annual payments; paying fixed side

Date        Time (yrs)   6 mo LIBOR (%)   Fixed payment   Floating receipt   Value (receipt - payment)
24-Jan-97   0.0                           0               0                  0
24-Jul-97   0.5          5.679            320,000         283,970            -36,030
24-Jan-98   1.0          6.252            320,000         312,946            -7,054
24-Jul-98   1.5          6.371            320,000         318,513            -1,486
24-Jan-99   2.0          6.642            320,000         332,287            12,287
24-Jul-99   2.5          6.692            320,000         334,587            14,587
24-Jan-00   3.0          6.878            320,000         343,803            23,804

Table 3 shows that given the forward curve, the Company expects to make net payments to the Bank
for the first year and a half of the swap and then receive net payments afterwards.

5. See the Appendix for the proper convention used to compute semi-annual fixed and floating payments. Throughout this
article we simplify the analysis and assume that the semi-annual basis is 0.5.
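For readers who want to reproduce the cashflow columns of Table 3, the following is a minimal sketch in Python. It uses the simplified semi-annual basis of 0.5 from footnote 5 and the forward 6-month LIBOR figures shown in the table; small differences versus the published floating receipts reflect the actual day-count conventions described in the Appendix.

```python
# Sketch: net cashflows of the 3 year, USD 10mm par swap from the Company's
# (fixed-rate payer's) perspective, using the simplified 0.5 semi-annual basis.
NOTIONAL = 10_000_000
FIXED_RATE = 0.064
BASIS = 0.5  # simplified semi-annual basis (see footnote 5)

# Forward 6-month LIBOR (in percent) applied to each payment date (Table 3)
forward_libor = [5.679, 6.252, 6.371, 6.642, 6.692, 6.878]

for period, libor in enumerate(forward_libor, start=1):
    fixed_payment = NOTIONAL * FIXED_RATE * BASIS
    floating_receipt = NOTIONAL * (libor / 100.0) * BASIS
    net = floating_receipt - fixed_payment  # value = receipt - payment
    print(f"{period * 0.5:>4.1f}y  fixed {fixed_payment:>10,.0f}  "
          f"floating {floating_receipt:>10,.0f}  net {net:>10,.0f}")
```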


The swap has a total of 6 semi-annual periods when cashflows are generated. Now, to compute exposures
we need to establish sampling times, i.e., dates when exposures are measured. In the following
analysis, sampling times are placed immediately after each exchange of cashflows as well as immediately
after the trade date (i=0). It is important to note that while these sampling times are equally
spaced, this need not be the case in general. The number and placement of sampling times is arbitrary.
However, while the number and placement of the sampling times is arbitrary, the effect on exposure
measures is not insignificant. In other words, exposure measures are sensitive to the number and
location of sampling times. Chart 2 shows a timeline of the swap's cashflows and sampling times.
Chart 2
Sampling times and cashflows of 3 year IR swap
[Timeline from 0.0 to 3.0 years: the sampling times t, t+1, ..., t+5 (gray arrows) are placed immediately after each cashflow exchange; black lines mark the dates when cashflows are generated.]

The timeline consists of 6 sampling times (gray arrows) at which credit exposure is measured. The
black lines denote times when cashflows are generated.

2.1 Statistical approach


Assume that at each sampling time t+i (i = 0,...,5), the present value of an outstanding transaction, $V^*_{t+i}$,
is normally distributed with mean $\mu_{t+i}$ and standard deviation $\sigma_{t+i}$. $V^*_{t+i}$ represents the present value
of cashflows generated between time t+i and the end of the swap. That is to say, the cashflows are discounted
back to time t+i. We can write the value of the transaction explicitly as a random variable

[4]  $V^*_{t+i} \sim N(\mu_{t+i}, \sigma_{t+i})$

Chart 3 shows the typical normal curve representing the distribution of $V^*_{t+i}$ with mean $\mu_{t+i}$ and standard
deviation $\sigma_{t+i}$.

Chart 3
Normal PDF of $V^*_{t+i}$
[Standard bell-shaped probability density of $V^*_{t+i}$, centered at $\mu_{t+i}$.]

From our earlier discussion we know that credit exposures at any sampling time, $E_{t+i}$, are positive only
when the value of the underlying transaction at t+i is in-the-money. It follows that credit exposures can
be modeled in terms of $V^*_{t+i}$ as


[5]  $E_{t+i} = V^*_{t+i}$ if $V^*_{t+i} > 0$;  $E_{t+i} = 0$ if $V^*_{t+i} \le 0$

Since $V^*_{t+i}$ has a continuous probability distribution, $E_{t+i}$ is a mixture of continuous (when
$E_{t+i} = V^*_{t+i}$) and discrete (when $E_{t+i} = 0$) parts. In fact, the distribution of exposures is that of a
censored normal distribution. It is censored since all values of $V^*_{t+i} \le 0$ translate into $E_{t+i} = 0$.
Chart 4 shows a plot of the probability density function of $E_{t+i}$.

Chart 4
The distribution of exposures $E_{t+i}$
[Censored normal density: a probability mass (spike) at zero and the positive half of the normal curve to its right.]

The spike that occurs at 0 results from changing all negative values of $V^*_{t+i}$ to zero. In the discussion
that follows we will be interested in the mean of the distribution of exposures $E_{t+i}$, denoted $\bar{E}_{t+i}$.
Given this framework we can now provide exact expressions for the worst case and expected credit exposure measures.

2.1.1 Calculating maximum and peak exposures


Maximum exposure ($ME_{t+i}$) at sampling time t+i is an estimate of the maximum credit exposure
given that there is a 5% chance that the realized loss is actually greater. Alternatively expressed, in the
case of default by a counterparty, there would be a 5% chance of having to pay more than this amount
to replace the outstanding transaction. Mathematically, $ME_{t+i}$ for a 95% confidence interval is given
by the expression

[6]  $ME_{t+i} = \max(0,\; \mu_{t+i} + 1.65\,\sigma_{t+i})$

See the Appendix for a derivation of the maximum exposure estimate.

Given a set of maximum exposure estimates at each of the N+1 sampling times, peak exposure ($PE_t$) is simply
the maximum of the maximum exposures, that is,

[7]  $PE_t = \max(ME_t, ME_{t+1}, \ldots, ME_{t+N})$
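As a minimal sketch, Eqs. [6] and [7] translate directly into code. The `mus` and `sigmas` arguments are whatever means and standard deviations have been estimated for each sampling time (for example, the forward values and volatilities reported in Table 4 below); the function names are illustrative only.

```python
def maximum_exposure(mu, sigma, scale=1.65):
    """Eq. [6]: worst case exposure at one sampling time, floored at zero."""
    return max(0.0, mu + scale * sigma)

def peak_exposure(mus, sigmas):
    """Eq. [7]: the largest of the maximum exposures across the sampling times."""
    return max(maximum_exposure(mu, sigma) for mu, sigma in zip(mus, sigmas))
```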


2.1.2 Calculating expected and average exposures


Recall that expected exposure at any point in time in the future measures how much, on average, one
can expect to lose given a default by its counterparty. The expected exposure, denoted $\bar{E}_{t+i}$, is simply
the expected value of the exposure distribution which was presented in Chart 4. The mathematical
expression for expected exposure at time t+i is

[8]  $\bar{E}_{t+i} = \sigma_{t+i}\,\phi(\mu_{t+i}/\sigma_{t+i}) + \mu_{t+i}\left[1 - \Phi(-\mu_{t+i}/\sigma_{t+i})\right]$

where

$\mu_{t+i}$ and $\sigma_{t+i}$ are the mean and standard deviation of the value of outstanding transactions,

$\phi(\mu_{t+i}/\sigma_{t+i})$ is the standard normal pdf evaluated at $\mu_{t+i}/\sigma_{t+i}$, and

$\Phi(-\mu_{t+i}/\sigma_{t+i})$ is the standard normal cumulative distribution function evaluated at $-\mu_{t+i}/\sigma_{t+i}$.

A complete derivation of Eq. [8] is given in the Appendix. We can use Eq. [8] to compute a set of
expected exposures at different sampling times. Having computed these exposures we can compute the
average exposure ($AE_t$), which is defined as

[9]  $AE_t = \sum_{i=0}^{N} \omega_{t+i}\,\bar{E}_{t+i}$

where the weights $\omega_{t+i}$ used to discount the exposures are defined as

[10]  $\omega_{t+i} = \dfrac{\delta[t, t+i]}{\sum_{i=0}^{N} \delta[t, t+i]}$

The weights $\omega_{t+i}$ depend on discount factors $\delta[t, t+i]$ which determine the present value at time t of
cashflows occurring at sampling times t+i.
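The expected and average exposure calculations in Eqs. [8] through [10] can be sketched as follows. The `norm` object is the standard normal distribution from SciPy, and `discount_factors` are the $\delta[t, t+i]$ of Eq. [10] (with a factor of 1 at the current time); the helper names are illustrative.

```python
from scipy.stats import norm

def expected_exposure(mu, sigma):
    """Eq. [8]: mean of the censored normal exposure distribution."""
    if sigma == 0.0:
        return max(mu, 0.0)  # at i = 0 this is simply the current exposure
    ratio = mu / sigma
    return sigma * norm.pdf(ratio) + mu * (1.0 - norm.cdf(-ratio))

def average_exposure(expected_exposures, discount_factors):
    """Eqs. [9]-[10]: discount-factor-weighted average of the expected exposures."""
    total = sum(discount_factors)
    return sum((d / total) * e
               for d, e in zip(discount_factors, expected_exposures))
```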

2.1.3 Computing potential exposures: practical issues


In practice, calculating the four aforementioned credit exposure measures (maximum, peak, expected
and average exposure) requires expressions for the mean and standard deviation of the distribution of
$V^*_{t+i}$. We define the mean and standard deviation as follows:

[11]  $\mu_{t+i} = V_{t+i}$,  $\sigma_{t+i} = \sqrt{\tau_i}\;\sigma[i, T]$

where

$V_{t+i}$ is the forward value of the transaction, i.e., it is the value of cashflows generated between time
t+i and the maturity of the swap, discounted back to time t+i,

$\sigma[i, T]$ is the daily standard deviation of weighted returns on a portfolio that generates cashflows
between time t+i and the swap's maturity, T (the weights are given by the forward values of the cashflows
that are generated between time t+i and T), and

$\tau_i$ is the number of days between times t and t+i, that is, the number of days in period i.⁶

6. Note that when i=0 we simply compute current exposure.


The calculation of $\sigma_{t+i}$ is a function of the sampling time and the time of the final cashflow. For
example, referring to the 3 year par swap, suppose the sampling time is one year, t+2. In this case,
$\sigma_{t+2} = \sqrt{\tau_2}\;\sigma[2, T]$ where $\tau_2$ is the number of business days in 1 year (252) and $\sigma[2, T]$ is the standard
deviation of returns on a swap that has a maturity of two years. Note that $\sigma[2, T]$ is a function of the
volatilities, correlations and cashflows generated by the swap between t+2 and T. In other words, it is
the RiskMetrics daily VaR estimate, computed as if the current analysis date were one year forward, divided by 1.65.

In the preceding example, we find that if the time horizon is one year (the t+2 sampling time), then the
volatility we are interested in is that of a two year swap since the last cashflow occurs two years after
the t+2 sampling time. Chart 5 shows the relationship between sampling times and the required volatility
estimates for the 3 year USD par swap.
Chart 5
Relationship between sampling time and swap maturity
[Timeline: sampling times t+1 through t+5 plotted against a 1, 2 and 3 year time axis, with an arrow above each sampling time spanning the remaining maturity of the swap.]

The arrows above the sampling times represent the difference between the swap's maturity and the sampling
time. Note that as the sampling time increases, the maturity of the swap whose volatility is required decreases.
Now, let's examine the credit exposure calculation from the Company's perspective. The first step in
measuring potential exposure is the calculation of the forward value of the swap's cashflows. Table 4 provides
the Company's mark-to-market value of the swap and volatility at five sampling times, with each
time occurring six months apart.
Table 4
Company's credit exposure parameters
3 year USD IR swap

Sample time (i)   Forward value V_{t+i}   Time horizon (days)   Volatility
0                 0
1                 36,030                  126                   179,406
2                 44,211                  252                   194,565
3                 47,105                  378                   168,115
4                 36,383                  504                   110,674
5                 23,012                  630                   22,036

Notice that the swap's forward value to the Company at each sampling time is zero or positive. In order
to compute the swap's forward value at different sampling times, we were required to compute the forward
discount curves at each sampling time. The forward discount rates used to compute the market
value of the swap at each of the sampling times are presented in Table 5.
Table 5
Forward discount rates
Used to compute V_{t+i}, in percent

                          Sample time (i)
Date        Time (yrs)    0        1        2        3        4        5
24-Jan-97   0
24-Jul-97   0.5           97.23
24-Jan-98   1.0           94.29    96.96
24-Jul-98   1.5           91.37    93.97    96.91
24-Jan-99   2.0           88.44    90.95    93.79    96.78
24-Jul-99   2.5           85.57    88.01    90.76    93.68    96.76
24-Jan-00   3.0           82.73    85.08    87.74    90.54    93.54    96.67

Table 5 highlights how discounting is performed in the credit exposure model. At each sampling time
future cashflows are discounted back to the sampling time rather than the current time.
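To illustrate the discounting convention, the sketch below recomputes the Company's forward value at the first sampling time (t+1) by applying the sample-time-1 column of Table 5 to the net cashflows of Table 3 that remain after t+1; the result is approximately the 36,030 reported in Table 4.

```python
# Net cashflows (receipt - payment) of the 3 year swap remaining after t+1 (Table 3)
remaining_cashflows = [-7_054, -1_486, 12_287, 14_587, 23_804]
# Forward discount factors at sampling time i = 1, in percent (Table 5)
forward_discount = [96.96, 93.97, 90.95, 88.01, 85.08]

forward_value = sum(cf * df / 100.0
                    for cf, df in zip(remaining_cashflows, forward_discount))
print(f"Forward value at t+1: {forward_value:,.0f}")  # approximately 36,030
```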
Having computed the market value of the swap and volatility at each sampling time, the next step is to
compute the expected, maximum and peak exposures. Table 6 provides estimates of these exposures at
each sampling time.
Table 6
Company's expected, maximum and peak exposures by statistical approach
3 year USD par swap

Sample time (i)   Forward value V_{t+i}   Expected exposure   Maximum exposure @ 95%
0                 0                       0                   0
1                 36,030                  91,026              332,050
2                 44,211                  101,721             365,423
3                 47,105                  93,236              324,494
4                 36,383                  64,709              218,995
5                 23,012                  24,698              54,373
Peak exposure                                                 365,423

Notice how the expected and maximum exposures start off small and increase until they reach a peak
(at sampling time t+2), and then decrease as the sampling time nears the swap's maturity. The swap's
credit exposure evolves in such a manner because of two factors: (1) volatility, $\sigma_{t+i}$, scales with time
(through $\sqrt{\tau_i}$) and (2) there are fewer future cashflows generated by the swap as the sampling time increases.
The result is the classic hump-shaped profile of the expected and maximum exposures which is
presented in Chart 6.


Chart 6
Company's expected and maximum exposure profile
3 year USD par swap; exposure is measured as a percent of notional
[Hump-shaped maximum and expected exposure profiles (percent of notional) across the sampling times, with the maximum exposure curve lying above the expected exposure curve.]

As anticipated, the maximum exposure lies above expected exposure. Finally, we can use the spot discount curve at the current time along with the expected exposures to compute average exposure. Table
7 provides the details for the average exposure calculation.
Table 7
Company's average exposure calculation
3 year USD par swap

Sample time (i)   Spot discount rate (%)   Discount weight   Weighted expected exposure
0                 1                        0.179             0
1                 97.23                    0.1746            15,893
2                 94.29                    0.1693            17,221
3                 91.37                    0.1640            15,298
4                 88.44                    0.1588            10,275
5                 85.57                    0.1536            3,795
Average exposure                                             62,484

We can see from Table 7 that as of January 24, 1997 the Company has an average exposure of
USD62,484.
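Putting the pieces together, the short sketch below feeds the Table 4 parameters and the Table 7 spot discount factors through the formulas of sections 2.1.1 and 2.1.2; the output should come close to the figures published in Tables 6 and 7, up to rounding in the inputs.

```python
from scipy.stats import norm

forward_values = [0, 36_030, 44_211, 47_105, 36_383, 23_012]      # Table 4
volatilities   = [0, 179_406, 194_565, 168_115, 110_674, 22_036]  # Table 4
spot_discount  = [1.0, 0.9723, 0.9429, 0.9137, 0.8844, 0.8557]    # Table 7, as fractions

def expected_exposure(mu, sigma):
    # Eq. [8]; with sigma = 0 this reduces to current exposure max(mu, 0)
    if sigma == 0.0:
        return max(mu, 0.0)
    return sigma * norm.pdf(mu / sigma) + mu * (1.0 - norm.cdf(-mu / sigma))

expected = [expected_exposure(v, s) for v, s in zip(forward_values, volatilities)]
maximum  = [max(0.0, v + 1.65 * s) for v, s in zip(forward_values, volatilities)]
weights  = [d / sum(spot_discount) for d in spot_discount]

print("peak exposure   :", f"{max(maximum):,.0f}")                                   # cf. Table 6
print("average exposure:", f"{sum(w * e for w, e in zip(weights, expected)):,.0f}")  # cf. Table 7
```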
Now, suppose that in addition to the 3 year par IR swap, the Bank and Company on January 24, 1997
also enter into a 4 year par IR swap where the Bank pays a fixed rate of 6.53%. Table 8 shows the
cashflows generated by this swap from the Company's perspective (receiving fixed).


Table 8
4 year 10mm USD interest rate par swap
Fixed rate = 6.53%; semi-annual payments; receiving fixed side

Date        Time (yrs)   6 mo LIBOR (%)   Fixed receipt   Floating payment   Value (receipt - payment)
24-Jan-97   0.0                           0               0                  0
24-Jul-97   0.5          5.679            326,500         283,970            42,530
24-Jan-98   1.0          6.252            326,500         312,946            13,553
24-Jul-98   1.5          6.371            326,500         318,513            7,986
24-Jan-99   2.0          6.642            326,500         332,287            -5,787
24-Jul-99   2.5          6.692            326,500         334,587            -8,087
24-Jan-00   3.0          6.878            326,500         343,803            -17,303
24-Jul-00   3.5          6.899            326,500         345,307            -18,807
24-Jan-01   4.0          7.04             326,500         352,438            -25,938

Note that given the forward curve as of January 24th, the Company could expect cash inflows for the
first 1 1/2 years and then after that expect to make payments to the Bank. Table 9 presents the forward
values of the swap from the Company's perspective.
Table 9
Forward value calculation
4 year USD swap

Sampling time (i)   Forward value V_{t+i}
0                   0
1                   -42,281
2                   -57,415
3                   -67,229
4                   -63,676
5                   -57,718
6                   -42,399
7                   -25,055

Table 9 shows that the swap's market value from the Company's point of view is negative at each sampling
time. The fact that all of the forward values in Table 9 are negative may seem unintuitive given the
positive near-term values of the swap based on the 6-month LIBOR forward curve presented in Table 8. However,
recall that the sampling times are placed immediately after each exchange of cashflows and therefore
the first forward value of USD (42,281) does not take into account the positive value of the swap at July
24, 1997.

The negative forward values translate into zero current exposure because the Company does not expect
to be a net receiver of payments from the Bank at any sampling time. For the Company, the average
and peak exposures for the 4 year USD swap are USD57,084 and USD453,332 (at t+3), respectively.

2.1.4 Measuring credit exposure of a portfolio of swaps


The focus of the discussion so far has been on the measurement of a single transaction's (counterparty's)
credit exposure. Now, we present a simple approach to measure a portfolio's credit exposure. We demonstrate
this approach by computing the credit exposure of the Company that holds both the 3 and 4 year par swaps.
The simplest, but potentially most misleading method for measuring credit exposure of a swap portfolio
would be to aggregate the credit exposures computed above. In this case, the average exposure for the
Company is USD119,568 (USD57,084 + USD62,484). Note that since peak exposure is calculated in
terms of future values, it is not obvious how to report a peak exposure estimate for the portfolio of the
two swaps since the peak exposures for the 3 year and 4 year par swaps occur at different sampling
times, t+2 and t+3, respectively.
An alternative, and more appealing, approach to measure the portfolio's credit exposure is to apply
netting.⁷ There are various definitions of netting, but for our purposes we will focus on what is often
referred to as bilateral netting.⁸ This is where, for any given counterparty, positive market values are
offset against negative market values at each sampling time. Naturally, we would expect such an approach
to reduce average exposures relative to simple aggregation.
Table 10 presents the swap portfolio's forward value at each sampling time. These market values were
computed by first netting the swaps' cashflows and then finding their present value at each sampling
time. Note that the sampling times of the portfolio coincide with those of the longest maturity swap (4
years).
Table 10
Portfolio forward values and exposures by statistical approach
Company's swap portfolio

Sample time (i)   Forward value V_{t+i}   Expected exposure   Maximum exposure @ 95%
0                 0                       0                   0
1                 -6,500                  29,727              129,468
2                 -13,203                 41,835              185,940
3                 -20,123                 49,315              223,304
4                 -27,292                 54,038              248,984
5                 -34,705                 62,165              289,125
6                 -42,399                 35,441              181,000
7                 -25,055                 2,337               17,958
Peak exposure                                                 289,125

The negative market values imply that the negative cashflows generated by the 4 year swap dominate
the positive cashflows of the 3 year swap. As a result, the portfolio's expected exposure at each sampling
time is much less than the sum of the expected exposures of the individual swaps at that sampling
time, and we find that netting can have a significant impact on the credit exposure estimate.
The average and peak exposures for this netted swap portfolio are USD34,128 and USD289,125,
respectively. Not only is the average estimate based on netting lower than the aggregation approach,
but now it is straightforward to compute peak exposure.

7. See, for example, Smithson, Smith and Wilford (1995).

8. It is important to note that netting is only appropriate in those jurisdictions where it is legal to net swap payments in the
event of bankruptcy.


In general, we can apply bilateral netting in a portfolio that consists of many counterparties. For example,
suppose the Company enters into numerous swap arrangements with, say, three different banks. In
such a situation, the Company would compute its credit exposure on a bilateral basis by first splitting
the swap arrangements into three groups according to the swap's counterparty; second, netting all cashflows
within each group; and third, computing credit exposure measures following the methodology presented
above.
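The bilateral netting procedure just described is easy to sketch: group the swaps by counterparty, net the cashflows that fall on the same date within each group, and then run each netted cashflow stream through the exposure calculations of the previous sections. The data layout below is a hypothetical illustration (the cashflows are the Company-perspective net values from Tables 3 and 8).

```python
from collections import defaultdict

# Each swap: (counterparty, {payment date in years: net cashflow to the Company})
swaps = [
    ("Bank", {0.5: -36_030, 1.0: -7_054, 1.5: -1_486, 2.0: 12_287,
              2.5: 14_587, 3.0: 23_804}),                               # 3 year swap
    ("Bank", {0.5: 42_530, 1.0: 13_553, 1.5: 7_986, 2.0: -5_787, 2.5: -8_087,
              3.0: -17_303, 3.5: -18_807, 4.0: -25_938}),               # 4 year swap
]

netted = defaultdict(lambda: defaultdict(float))
for counterparty, cashflows in swaps:
    for date, amount in cashflows.items():
        netted[counterparty][date] += amount  # offset positive and negative flows

for counterparty, cashflows in netted.items():
    print(counterparty, {d: round(v) for d, v in sorted(cashflows.items())})
```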

2.2 Option pricing approach


Recall from Eq. [3] that we defined current exposure at time t as the maximum of the value of transactions
$V_t$ and 0. We can generalize this expression to hold at each sampling time so that we can define
exposure as

[12]  $E_{t+i} = \max(V_{t+i}, 0)$

In words, Eq. [12] states that the exposure at t+i is the maximum of the forward value of transactions at
time t+i and 0. In general, we can define $V_{t+i}$ as consisting of the difference between assets (inflows)
and liabilities (outflows). If we let $A_{t+i}$ and $L_{t+i}$ represent the assets and liabilities at t+i, respectively,
then we have $V_{t+i} = A_{t+i} - L_{t+i}$, so that

[13]  $E_{t+i} = \max(A_{t+i} - L_{t+i}, 0)$

The reader may notice the similarity between Eq. [13] and the intrinsic value of a call option where $A_{t+i}$
is the price of the underlying and $L_{t+i}$ is the strike price. The key difference between Eq. [13] and a
simple option's intrinsic value is that $L_{t+i}$ can be random. Using the results provided by Margrabe
(1978)⁹, it can be shown that the expected value of Eq. [12], which yields the expected exposure measure,
is given by

[14]  $\bar{E}_{t+i} = A_{t+i}\,\Phi(d_1) - L_{t+i}\,\Phi(d_2)$

where

$d_1 = \dfrac{\log(A_{t+i}/L_{t+i}) + \dfrac{\sigma^2_{t+i}}{2}\,\tau_i}{\sigma_{t+i}\sqrt{\tau_i}}$,  $d_2 = d_1 - \sigma_{t+i}\sqrt{\tau_i}$

and $\sigma_{t+i}$ is the daily volatility (in percent) that takes into account that both $A_{t+i}$ and $L_{t+i}$ can be
random.
The maximum exposure estimate based on this model is given by the following expression

[15]  $ME_{t+i} = V_{t+i} + A_{t+i}\left(e^{\,1.65\,\sigma_{t+i}\sqrt{\tau_i}\,-\,\frac{\sigma^2_{t+i}}{2}\tau_i} - 1\right)$

9. See Margrabe, W., "The Value of an Option to Exchange One Asset for Another," Journal of Finance, 33 (March
1978), 177-86.
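As a minimal sketch of the option pricing approach, the function below evaluates Eqs. [14] and [15] as reconstructed above. `assets` and `liabilities` are the forward values of the inflow and outflow legs at the sampling time, `sigma_daily` is the combined daily volatility of the two legs, and `tau_days` is the number of days to the sampling time; the names are illustrative.

```python
import math
from scipy.stats import norm

def option_pricing_exposure(assets, liabilities, sigma_daily, tau_days):
    """Expected and maximum exposure at one sampling time, per Eqs. [14]-[15]."""
    horizon_vol = sigma_daily * math.sqrt(tau_days)
    d1 = (math.log(assets / liabilities)
          + 0.5 * sigma_daily ** 2 * tau_days) / horizon_vol
    d2 = d1 - horizon_vol
    expected = assets * norm.cdf(d1) - liabilities * norm.cdf(d2)           # Eq. [14]
    maximum = (assets - liabilities) + assets * (
        math.exp(1.65 * horizon_vol - 0.5 * sigma_daily ** 2 * tau_days) - 1.0)  # Eq. [15]
    return expected, maximum
```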


Using RiskMetrics methodology and data, we applied this technique to find the credit exposure of
the swaps presented above. The results for the 3 year USD par swap and a combined portfolio of 3 and
4 year swaps are presented in Table 11.
Table 11
Company's expected, maximum, and peak exposure by option pricing approach
3 year USD par swap and swap portfolio

Sample time (i)   Expected exposure   Maximum exposure @ 95%
3 year swap
0                 0                   0
1                 91,153              335,908
2                 101,888             369,952
3                 93,388              328,262
4                 64,784              220,723
5                 24,704              59,499
Peak exposure                         369,952
Swap portfolio
0                 0                   0
1                 35,441              153,540
2                 48,464              214,239
3                 55,343              249,346
4                 58,561              269,106
5                 64,850              301,887
6                 35,550              182,584
7                 2,345               18,017
Peak exposure                         301,887

Comparing the results provided in Tables 6 and 10 with those presented in Table 11 shows that the expected,
maximum and peak exposures produced by the statistical and option pricing approaches are very
similar. This should not be all that surprising since both models are using the same forward values and
RiskMetrics volatility estimates.

3. Credit exposure calculations using full simulation


In this section we describe how risk managers can simulate future swap rates in order to measure credit
exposure. Estimating credit exposure via interest rate simulation is motivated by the fact that credit exposure arises from changes in interest rates that occur after the swap contract is put into place.
We will describe the full simulation process using the 3 year USD par swap introduced earlier.
Let's look at the exposure from the point of view of the Company. If the Bank defaults 6 months after
settlement, the replacement cost for the Company (the cost to replace the Bank counterparty) would
be a function of the difference between the swap rate that prevailed when the swap was first purchased
(6.40%) and the 2 1/2 year par swap rate at the 6 month sampling time.
To determine the credit exposure to the Company at the 6 month sampling time we need to simulate the
distribution of 2 1/2 year USD par swap rates in 6 months, since they are the rates that the Company
will be faced with if default occurs. To do the simulation we need the volatility of the 2 1/2 year rate
over the 6 month horizon, $\sigma_{6m,2.5}$, which on January 24, 1997 is 12.07% [$\sqrt{126}$ (days) x 1.076% (the
current daily 2 1/2 year volatility)]. Also, we need the 2 1/2 year forward rate 6 months forward, $r^f_{2.5}$,
which, on January 24, 1997, is 6.545%. We use the following formula to simulate 2 1/2 year par rates
6 months forward

[16]  $\tilde{r}_{2.5} = r^f_{2.5}\; e^{\sqrt{126}\,\sigma_{2.5}\, z} = r^f_{2.5}\; e^{\sigma_{6m,2.5}\, z}$

where z is a standard normal variate and $\sigma_{2.5}$ is the current daily 2 1/2 year volatility. Chart 7 presents
a histogram of simulated 2 1/2 year par swap rates.
Chart 7
Histogram of simulated 2 1/2 year par swap rates
6 month forecast horizon
[Histogram of the simulated rates, ranging from roughly 4% to 11%.]
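The simulation step of Eq. [16] can be sketched as follows; the forward rate and daily volatility are the figures quoted in the text, and the number of trials is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

fwd_rate_2_5y = 0.06545   # 2 1/2 year par rate, 6 months forward (6.545%)
daily_vol = 0.01076       # current daily volatility of the 2 1/2 year rate (1.076%)
horizon_days = 126        # 6 months of business days

# Eq. [16]: lognormal simulation of 2 1/2 year par swap rates 6 months forward
z = rng.standard_normal(10_000)
simulated_rates = fwd_rate_2_5y * np.exp(np.sqrt(horizon_days) * daily_vol * z)

lo, hi = np.percentile(simulated_rates, [5, 95])
print(f"mean {simulated_rates.mean():.3%}, 5th-95th percentile {lo:.3%} to {hi:.3%}")
```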

Next, we compute the replacement cost 6 months forward. This value is given by the notional amount
(USD 10,000,000) times the semi-annual difference between the fixed rate of 6.40% and the simulated
distribution of par rates. The distribution of replacement costs at 6 months from settlement is presented
in Chart 8.


Chart 8
Histogram of replacement cost
6 month mark
[Histogram of simulated replacement costs, ranging from roughly -$122,000 to $196,000.]

It is assumed that the distribution of replacement costs prevails over the remaining 2 1/2 years of the
swap. That is, we apply the distribution of replacement costs to each of the semi-annual payment periods
for the remaining maturity of the swap. We then discount these cashflows back to the current time using
the current spot curve. As in the analytic approach, the distribution of exposures at the 6 month mark is
given by the positive values of replacement costs, and all negative replacement costs become zero. Chart 9
presents the simulated exposures at the 6 month forecast horizon.
Chart 9
Distribution of exposures
6 month forecast horizon
[Histogram of simulated exposures: a large spike at zero and a right tail extending to roughly $1,400,000.]

The present value of the expected exposure at the 6 month sampling time is USD112,882. The 95th
percentile of the exposure distribution is USD431,091. Chart 10 presents the expected and maximum
exposures at the 6 month and remaining sampling times (1, 1 1/2, 2 and 2 1/2 years).


Chart 10
Expected and maximum exposure profile based on simulation
Exposures are expressed as a percent of notional value
[Maximum and expected exposure (percent of notional) plotted at the 0.5, 1, 1.5, 2 and 2.5 year sampling times, with the maximum exposure profile lying above the expected profile.]

Table 12 presents the Company's average and peak exposure estimates for the 3 and 4 year swaps, as well
as the portfolio, produced by all three approaches.
Table 12
Company's credit exposure using analytic and simulation approaches
Average expected exposure and peak exposure

                      Statistical            Option pricing         Simulation
Portfolio             Average     Peak       Average     Peak       Average     Peak
3 year par IR swap    62,484      365,243    62,572      369,952    90,773      533,468
4 year par IR swap    57,084      453,332    57,293      461,606    61,281      445,766
Both swaps            34,128      289,125    37,423      301,887    33,838      230,310

The results presented in Table 12 show that the average exposure measures for the 4 year swap and
swap portfolio are quite similar across all three methodologies. The average exposures of the 3 year
swap produced by the statistical and option pricing approaches are about two-thirds the exposure given
by simulation. Comparing peak exposures, we find, as with average exposures, that the statistical and option
pricing approaches offer similar results. Interestingly however, the simulation approach produces quite
different peak exposures. Relatively large differences between the non-simulation and simulation
results may be due to two important factors. First, simulation uses par forward rates, rather than zero rates,
to simulate future rate distributions. And second, the non-simulation approaches use volatilities and
correlations based on zero rates whereas simulation applies volatilities and correlations to par rates.

4. Conclusions

This article has presented the computational details behind three methodologies for measuring credit
exposure. In so doing, our primary goal was to provide readers with the details necessary to perform the
calculations. We defined the credit exposure of a particular transaction as the amount subject to risk when
there is a change in the credit standing of a counterparty. We have used a simple swap portfolio to show
how to measure current exposure and estimate various levels of potential exposure by computing maximum,
peak, expected and average exposure.


Acknowledgments
The author would like to thank Chris Athaide, Mickey Bhatia, Guy Coughlan, Chris Finger and Jacques
Longerstaey for their constructive criticisms on earlier versions of this article.

Appendix

Basis convention

For the swaps presented in this article, the proper convention for computing interest payments is as follows.
For the 3 year par swap, the fixed rate of 6.40% assumes a 360-day (bond basis) year. On the other
hand, the U.S. LIBOR is a money market yield based on a 360-day year. The precise formulas for
determining the fixed-rate and floating-rate settlement cashflows are as follows:

Fixed-rate settlement payment

[A.1]  $0.064 \times \dfrac{\#\text{ of bond days}}{360} \times \$10\text{ million}$

Floating-rate settlement payment

[A.2]  $\text{6-mo LIBOR} \times \dfrac{\#\text{ of actual days}}{360} \times \$10\text{ million}$

Maximum exposure

We can derive the maximum exposure measure at the 95th percentile as follows:

[A.3]  $0.95 = \text{Prob}(E_{t+i} < ME_{t+i}) = \text{Prob}(E_{t+i} \le 0) + \text{Prob}(0 < E_{t+i} < ME_{t+i}) = \int_{-\infty}^{0} f(V^*_{t+i})\, dV^*_{t+i} + \int_{0}^{ME_{t+i}} f(V^*_{t+i})\, dV^*_{t+i} = \int_{-\infty}^{ME_{t+i}} f(V^*_{t+i})\, dV^*_{t+i}$

where $f(V^*_{t+i})$ is the probability density function for the outstanding value of the transactions at time
t+i. Based on our assumptions, $f(V^*_{t+i})$ is the normal density function. Therefore, $ME_{t+i}$ at the 95%
confidence level is given by $\mu_{t+i} + 1.65\,\sigma_{t+i}$. Note that since $ME_{t+i}$ has a lower bound of 0, we can
write maximum exposure at time t+i as

$ME_{t+i} = \max(0,\; \mu_{t+i} + 1.65\,\sigma_{t+i})$
Expected exposure

We now show how we arrive at the expression for expected exposure at time t+i, $\bar{E}_{t+i}$. If we let E[x] denote
the mathematical expectation of some random variable x, then we can write the expected exposure as

[A.4]  $E[E_{t+i}] = \text{prob}(E_{t+i} = 0)\,E[E_{t+i} \mid E_{t+i} = 0] + \text{prob}(E_{t+i} > 0)\,E[E_{t+i} \mid E_{t+i} > 0]$

Since $E[E_{t+i} \mid E_{t+i} = 0] = 0$, $\text{prob}(E_{t+i} > 0) = 1 - \Phi(-\mu_{t+i}/\sigma_{t+i})$, and

[A.5]  $E[E_{t+i} \mid E_{t+i} > 0] = \sigma_{t+i}\,\dfrac{\phi(\mu_{t+i}/\sigma_{t+i})}{1 - \Phi(-\mu_{t+i}/\sigma_{t+i})} + \mu_{t+i}$

substituting these into Eq. [A.4] yields the expression in Eq. [8]. Using similar results we can derive an
expression for the standard deviation of exposures as well.

References

Smithson, Charles W., Clifford W. Smith Jr., and D. Sykes Wilford (1995), Managing Financial Risk:
A Guide to Derivative Products, Financial Engineering, and Value Maximization, Irwin, London.


The effect of EMU on risk management


Morgan Guaranty Trust Company
Risk Management Advisory
Jacques Longerstaey
(1-212) 648-4936
riskmetrics@jpmorgan.com

On January 1, 1999, if the currently agreed calendar is respected, a number of European currencies
will disappear into history and be replaced by a common monetary unit called the Euro. The purpose
of this article is to review how this will affect frameworks for market risk management and
how specific products such as RiskMetrics will be impacted by the change. Since all of the details of
European Monetary Union (EMU) have yet to be ironed out and there is still uncertainty over whether
it will happen at all, the next few pages are just aimed at providing risk managers with an outline of
what will need to be done to firms' risk management processes and systems by early 1999.

In particular, we will focus on which changes will need to be made to the Value-at-Risk (VaR) methodologies
and the data commonly used to estimate VaR. Most of the articles written to date on the
implications of EMU for capital markets have focused on pricing instruments in a one-currency core
Europe: what will euro-yields be after 1999? While this is an important question, particularly for risk
managers in the run-up to monetary union, the focus of this article is on how EMU will change the risk
factors that affect the value of financial instruments and how these factors will be estimated in the first
few months of 1999.
One may question the opportune nature of spending time on an issue that is (1) still uncertain and (2) at
least 22 months away. Implementing risk measurement frameworks takes time, however, and most of
our comments will provide a general framework applicable to whatever currencies join monetary union
and whenever it actually happens. As firms invest in the processes and technology to manage market
risk in a VaR framework, some consideration should be given to designing systems to cope with the
potential changes resulting from EMU over the next couple of years.
The methods for estimating VaR, regardless of their statistical foundation, basically rely on a two step
process:
1. Identifying the risk factors that can affect the value of a financial instrument (foreign exchange, interest rate, equities...) and mapping the instruments to the respective risk factors (e.g., foreign exchange
forwards are exposed to both foreign exchange and interest rate risk).
2. Using historical risk factor data to estimate the maximum potential loss in the value of the position
with a given confidence percentile.
The process associated with these two steps will need to be revisited for the eight or so currencies which
are contenders for the first phase of EMU. Let us review how major classes of instruments are currently
treated in the RiskMetrics framework. From there, we will decompose the process into the steps mentioned
above and review the alternatives for a post-EMU Europe.

Fixed income instruments are typically exposed to interest rate and potentially foreign
exchange rate risks (for those investors with positions in instruments denominated in a currency other
than their base reporting currency).

Within fixed income, government bonds are decomposed into their component cash flows
(coupons + principal) and mapped to the volatility vertices by maturity. RiskMetrics currently
provides volatilities and correlations for the government bond zero rates of 17 markets, which
include most of the markets most likely to join EMU in 1999. The only exception is Finland, for which
we currently do not provide a term structure of government bond volatilities
(these can be approximated using swap rates). All other fixed income instruments are usually
mapped to the swap curve, which incorporates some measure of non-sovereign credit risk
(basically AA bank risk).


Post-EMU, fixed income instruments will be re-denominated in Euro and an alternative mapping framework will be required. We will discuss this in section 1.
Foreign exchange instruments (spot and the spot component of forward contracts) as well as
the currency exposure of instruments such as fixed income and equities are currently mapped
to their respective currencies. Post 1999, instruments re-denominated in Euro will be mapped
to the new currency. The absence of historical data (both volatility and correlation with regard
to non-EMU currencies and others such as USD and JPY) will temporarily reduce the usefulness of VaR models. In section 2, we will discuss potential proxies and how for a short period
following the introduction of the Euro, risk managers will need to use alternative approaches
to estimating market risk.
Equity instruments are currently mapped to domestic equity indices which will not be affected
by EMU (though it is possible that some consolidation in the equity markets will occur in later
stages of monetary union). The foreign exchange risk component of these investments will
require mapping to the Euro.

1. Mapping fixed income instruments under EMU


1.1. Government bonds
Though one might think that reducing the number of currencies would simplify VaR calculations
(assume 8 currencies convert to the Euro and the size of the RiskMetrics covariance matrix drops by
around 120 time series), it is really not as simple as that. While the foreign exchange component of fixed
income instruments will disappear (for the "ins") or be modified (for the "outs"), the bonds themselves
will continue to display interest rate risk characteristics that will be unique by market. In spite of currency
union, Belgian bonds will still be affected by different liquidity and credit considerations than
German bunds, and there will therefore be a continuing requirement to maintain a number of government
yield curves for market risk measurement.
The issue is complicated by the fact that EMU will create an environment seen nowhere else: a supranational
currency issued by an independent central bank (the level of its independence remains an
issue to be agreed upon by all parties concerned) but no supranational issuer with substantial tax and
spend powers to create a benchmark yield curve off of which to price all other fixed income instruments.
European governments will be like Canadian provinces, albeit with superior tax and spend powers,
without there being a Canadian government. Even with a common currency, there will continue to
exist a large, non-homogeneous euro-government bond market which will increasingly be driven by credit
considerations. These are unlikely to be very different at first from the ones that currently drive the
market, however: by creating one currency and transferring responsibility for monetary policy to a European
central bank, participants will just be setting in stone a practice that has been going on for years,
with Europe's monetary policy mainly being set by the Bundesbank.
This implies that for the first few months after the introduction of the Euro, we should be able to use
historical volatilities and correlations for the government bond term structures even though there will
have been a change in their currency denomination.
Longer term, credit valuation will become more important, particularly as member states will have lost
two components of financial flexibility: the first results from the transfer of monetary sovereignty to the
European Central Bank, the second from continued pressure to practice restrictive fiscal policy under
the auspices of the so-called "stability pact". Credit analysis will increasingly have to rely on measures
of a country's internal access to capital as external data on balance of payments performance will no
longer be available. This could worsen the credit valuation of countries with high debt and little tax/
spend flexibility such as Belgium and Italy.
It is not impossible that over time, government bond term structures may be aggregated across countries
whose markets display similar risk/return characteristics. For example, we could fit a yield curve
model to the government bonds from Belgium, the Netherlands, Germany and France.

While the aggregated curve may look very similar to its individual market components, it is unlikely
that its behavior over time will perfectly match the movements of the respective bond markets. An indication
of this is that while fixed income volatilities have converged over the last year or so, correlations
remain somewhat unstable, as shown by Chart 1. While it is possible that following EMU some
aggregation will be possible (Germany and the Netherlands, for example), the amount of basis risk
resulting from the reduced granularity of adding France or Belgium to the data may still
be significant. Over the next 2 years, we will monitor the evolution in the risk profile of these markets
and use the data as a basis for deciding the post-1999 structure of the RiskMetrics datasets.
Chart 1
10-year government bond zero volatilities and correlations
in percent, daily horizon, 1.65 standard deviations
[Left panel: DEM, FRF and NLG 10-year zero-rate volatilities, January 2, 1995 to January 27, 1997. Right panel: DEM-NLG and DEM-FRF correlations over the same period.]

1.2. Swaps and non-government securities


Mapping swaps and non-government bond issues will pose another set of problems. Following monetary union, the swap markets of participating countries will become perfectly fungible. This will result in a swap market that will be larger and probably more liquid than any of the underlying government bond markets, and which is a potential candidate for a euro benchmark yield curve in the absence of a real government curve. We may thus avoid having to make a benchmark decision between the German bond market, which is large but antiquated, and its smaller but more modern French counterpart.
In the first few months of 1999, or whenever EMU actually starts, we will have no historical data on the euro-swap curve for VaR purposes. This will require risk managers to respond in a variety of ways:


1. Use models that quickly assimilate market data and respond rapidly to structural changes. This will reduce the time required to collect the data needed to estimate the variance of the new instruments. The standard RiskMetrics approach, which exponentially weights market data for the purpose of estimating volatility, will prove superior in this environment to models which take longer to adjust (a simple sketch of this effect follows this list). The internal models approach mandated by the BIS (using 1 year of equally weighted data) in particular will not be of very much use.
2. Identify proxy time series which will serve to estimate VaR during the first few months of EMU, as market data on the Euro is collected. Recent trends in volatility indicate how well a proxy time series could work. If all of the "ins" display the same volatility profile by 1999, then choosing one of them as the proxy for Euro-swap rates will be a reasonable alternative. Chart 2 below shows how 10-year swap rate volatilities have converged over the last few years as expectations for EMU have risen.

Chart 2
10-year swap rate volatilities
in percent, daily horizon, 1.65 standard deviations
[FRF, DEM, and NLG 10-year swap rate volatilities (roughly 0.0%-4.0%), plotted from January 2, 1995 to January 27, 1997.]

3. Access alternative time series which can provide risk management systems with additional information such as implied volatility levels. If these deviate significantly from the historical proxies used,
bring this information to the attention of management as an indication that the markets are pricing in
higher levels of risk. A previous example of the value of such proxies was the 1992 exchange rate
mechanism crisis: implied Lira/mark volatilities started to rise in June, a full three months before the
actual devaluation of the lira (see chart 3).


Chart 3
Lira/DM exchange rate and volatility levels
[Lira/DEM exchange rate (roughly 750-950, left scale) and Lira/DEM implied volatility (roughly 0%-10%, right scale), plotted from April 30, 1992 to January 22, 1993.]

As we get closer to 1999, implied volatility levels in swaptions may prove an interesting temporary substitute for historical data for VaR purposes. Current swaption implied volatility curves already show a hump in the 2-4 year sector, consistent with the uncertainty surrounding monetary union (see Chart 4).

Chart 4
Swaption implied volatility levels
in percent annualized, 1 standard deviation
[Implied volatility surface (values roughly 3.0-7.5) plotted by option maturity (1 month to 30 years) and underlying swap maturity (1 year to 30 years).]


4. Perform stress tests to account for the uncertainty associated with the union process. EMU is the ultimate event risk, even though everyone has been forewarned about it. Over the next 3 years, Europe's financial markets could be rocked significantly by deviations from the process mapped out by the authorities.
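To make point 1 above concrete, the sketch below (in Python) compares an exponentially weighted volatility estimate with a one-year equally weighted estimate after a simulated jump in volatility. The return series and the 0.94 decay factor are illustrative assumptions, not a prescription; the point is simply that the exponentially weighted estimate adapts faster to the new regime.

```python
# Sketch: an exponentially weighted volatility estimate adapts to a structural
# break faster than a 1-year equally weighted estimate. All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
# 250 "pre-break" days at 0.3% daily volatility, then 50 "post-break" days at 0.8%
returns = np.concatenate([rng.normal(0, 0.003, 250), rng.normal(0, 0.008, 50)])

decay = 0.94                        # assumed daily decay factor (lambda)
ewma_var = returns[0] ** 2
for r in returns[1:]:
    ewma_var = decay * ewma_var + (1 - decay) * r ** 2   # recursive EWMA variance

equal_var = np.mean(returns[-250:] ** 2)                  # 1 year of equally weighted data

print(f"EWMA volatility estimate:             {np.sqrt(ewma_var):.3%}")
print(f"Equally weighted volatility estimate: {np.sqrt(equal_var):.3%}")
# The EWMA estimate sits near the new 0.8% regime, while the equally weighted
# estimate is still dominated by the 200 pre-break observations in its window.
```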

2. Mapping foreign exchange exposure to the Euro


As is the case for the Euro-swap market, there will be no historical data available in January 1999 for estimating the volatility of Euro FX rates against the USD, the JPY, and the other currencies, both within and outside Europe, that will not participate in EMU. As a result, for the first few months of 1999 (RiskMetrics daily volatility estimates basically require 74 days' worth of data), risk managers will need to identify proxies to be used in the risk measurement process.
It is not possible at this time to define which currency will act as the best proxy for estimating Euro
foreign exchange risk in the first quarter of 1999. The choice will depend on what type of Euro we end
up with.
One thing is certain: using ECU exchange rate history will not make a lot of sense. Even though ECU denominated instruments will be converted to Euros at a 1:1 rate, the two currencies are different in nature: the ECU is an artificially constructed basket currency, while the Euro will be a full-fledged currency issued by a system of central banks. As such, its value and behavior will be governed by the same economic fundamentals that drive the values of currencies issued by central banks around the world. The market's perception of the Euro will drive its value and volatility. If the European central bank's independence and monetary policy are modelled after the Bundesbank, then it could well be that the then-defunct Deutsche mark's history may provide us with a useful proxy. If, on the other hand, questions arise about whether participating countries share a common vision of EMU and the Euro is perceived as a weaker currency as a result, the choice of a proxy may become elusive.
Given the nature of the process, which is essentially political, it is likely that both the interest rate and currency markets will be subject to periods of increased volatility under two different environments:
1. As we approach EMU and decisions on individual countries' participation are taken; and
2. As EMU is phased in and questions arise about its long term sustainability.
Answering questions about the market risks associated with these two environments will require much more from risk managers than a simple VaR number. Scenario analysis along different paths of the EMU decision tree (who's in, who's out, how long do they stay in) will be essential and will require a hefty dose of judgement. But then again, if EMU forces managers to reduce their sometimes blind faith in their market risk models, risk management will have become a more mature practice.


Streamlining the market risk measurement process


Morgan Guaranty Trust Company
Risk Management Research
Peter Zangari
(1-212) 648-8641
zangari_peter@jpmorgan.com

In this note we describe a simple and effective approach for calculating Value-at-Risk (VaR) that reduces some of the computational burdens confronting today's risk managers. We propose a general methodology to measure VaR that is based on what we refer to as the "portfolio aggregation principle." The portfolio aggregation principle consists of three fundamental steps:
1. Construct a time series of daily portfolio returns from a current set of portfolio positions and daily
returns on individual securities.
2. Treat the portfolio return time series as a dynamic process (e.g., allow for time-dependent volatility).
3. Determine VaR by fitting a statistical model directly to the time series of daily portfolio returns. For
example, apply the RiskMetrics methodology to obtain the portfolio volatility.
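As a minimal sketch of these three steps, the Python fragment below builds a portfolio return series from hypothetical positions and daily security returns, fits an exponentially weighted volatility to it, and reads off a one-day VaR. The position sizes, the simulated returns, and the 0.94 decay factor are all illustrative assumptions.

```python
# Minimal sketch of the portfolio aggregation principle (all inputs are hypothetical).
import numpy as np

rng = np.random.default_rng(1)
T = 250
asset_returns = rng.normal(0, 0.006, size=(T, 3))    # daily returns on 3 securities
positions = np.array([100.0, 200.0, 50.0])            # current positions, USD mm
total_value = positions.sum()
weights = positions / total_value

# Step 1: construct the time series of daily portfolio returns from current weights.
portfolio_returns = asset_returns @ weights

# Steps 2 and 3: treat the series as dynamic and fit an EWMA volatility to it.
decay = 0.94                                           # assumed daily decay factor
var = portfolio_returns[0] ** 2
for r in portfolio_returns[1:]:
    var = decay * var + (1 - decay) * r ** 2

VaR_95 = 1.65 * np.sqrt(var) * total_value             # one-day 95% VaR in USD mm
print(f"One-day 95% VaR: USD {VaR_95:.2f} mm")
```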
This recommendation not only simplifies the process for computing VaR but should also produce results that are superior to current methods, by enabling users to employ a variety of models, some of which may include sophisticated analytics.
Notice that there is nothing inherently new in what we are suggesting; instead, we exploit the notion that a portfolio's return is a weighted average¹ of returns on individual securities. In other words, a portfolio's return represents all relevant information contained in the individual returns. Furthermore, since the goal of VaR is to measure the market risk of a portfolio, it seems reasonable to model the portfolio return series directly.

1. Why this approach?


Until now, the debate over which VaR methodology to use has focused, for the most part, on two methodologies--the variance covariance method (VCV) and historical simulation (HS).
In the VCV model it is assumed that returns on individual securities follow a conditional multivariate normal distribution. Therefore, to compute VaR, users need the volatilities and correlations which describe this distribution. VaR is defined in terms of a portfolio's standard deviation (e.g., 1.65 times the standard deviation). The principal advantage of this model is that users can study directly the effect of correlations and volatilities on their VaR estimates. Its main drawbacks are two-fold. First, the VCV model assumes that returns follow a conditional multivariate normal distribution. Second, the VCV model may require the calculation of a very large covariance matrix, whose properties depend on both the number of individual securities represented in the covariance matrix and the number of historical observations used to estimate the volatilities and correlations.
Risk managers who are unsatisfied with the VCV model's assumptions, or who simply do not see the reason for estimating so many volatilities and correlations, propose historical simulation. According to this methodology, users take a current portfolio and revalue its components at market rates over some past period. This results in a distribution of portfolio returns from which users calculate VaR as some percentile (e.g., the 5th) of this distribution. The principal advantage of this approach is that by computing a historical time series of portfolio returns, HS does not have to deal with multivariate statistical issues. Moreover, since a portfolio return is a weighted average of returns on individual securities, its statistical properties tend to be more suitable for forecasting². There are two main drawbacks to HS. First, it is unclear, if not practically impossible, how to estimate VaR for forecast horizons of a week or longer.³
¹ Where the weights are given by the portfolio's positions.
² More on this will be stated below.


Second, it is often very difficult to estimate with confidence very small and large percentiles (e.g.,
smaller than 5 percent and larger than 95 percent) of the portfolio return distribution using tail statistics.
Now, we can overcome the drawbacks of VCV and HS by working directly with the portfolio return series. That is, we first construct a historical time series of daily portfolio returns given all underlying market data and the current set of portfolio weights (i.e., positions). Second, we fit a statistical model to the portfolio returns that not only describes the distribution of portfolio returns at any point in time, but also models how returns evolve over time. The latter feature gives practitioners a natural way to use daily price data to produce VaR forecasts over long horizons.
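As an illustration of that last point, the sketch below produces a 20-day volatility forecast from a univariate GARCH(1,1) model of the portfolio return series. The GARCH parameters and the one-day-ahead variance are assumed illustrative values rather than estimates; the point is the multi-step recursion, which mean-reverts toward the long-run level instead of simply scaling the one-day forecast by the square root of time.

```python
# Multi-day variance forecast from an (assumed) GARCH(1,1) model of portfolio returns.
import numpy as np

omega, alpha, beta = 1e-6, 0.05, 0.90        # illustrative GARCH(1,1) parameters
sigma2_next = 4e-5                            # illustrative one-day-ahead variance forecast
long_run_var = omega / (1 - alpha - beta)     # unconditional (long-run) variance

horizon = 20                                  # about one month of business days
daily_forecasts = []
s2 = sigma2_next
for _ in range(horizon):
    daily_forecasts.append(s2)
    # each additional step reverts geometrically toward the long-run variance
    s2 = long_run_var + (alpha + beta) * (s2 - long_run_var)

total_var = sum(daily_forecasts)              # variance of the 20-day return (zero mean, uncorrelated days)
print(f"20-day volatility forecast:  {np.sqrt(total_var):.3%}")
print(f"Naive sqrt-of-time scaling:  {np.sqrt(horizon * sigma2_next):.3%}")
```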
Notice that unlike historical simulation, the purpose of portfolio aggregation is to estimate the parameters of a statistical model which determine the location and shape of the portfolio return distribution,
rather than attempt to estimate a tail statistic directly from the data. Therefore, with portfolio aggregation there is a more efficient use of the data in that all returns, large and small, are used to estimate the
parameters.
Table 1 summarizes important differences and similarities between the portfolio aggregation principle,
HS and VCV.
Table 1
A comparison of portfolio aggregation, HS, and the VCV model

Issue: The required number of parameters to estimate (suppose there are 600 historical observations on 400 different securities in a given portfolio)
Portfolio aggregation: Using RiskMetrics, there is one parameter, the portfolio volatility. Note, we may want to estimate the portfolio mean as well.
HS: This methodology estimates the percentiles used to determine VaR directly from the portfolio returns.
VCV: Using the RiskMetrics methodology, there are 80,200 parameters without estimating the mean, or 80,600 parameters including the mean.

Issue: VaR forecasts for horizons longer than 1 day
Portfolio aggregation: Relatively simple. We can apply different types of volatility models to produce longer term forecasts.
HS: Very difficult, if not practically impossible. Requires a lot of historical data.
VCV: Simple, but limited by assumptions on the underlying model.

Issue: Accounting for skewness and kurtosis (ways to incorporate conditional non-normality)
Portfolio aggregation: There are several ways to incorporate skewness and kurtosis into the VaR forecast. For example, since we are only dealing with one time series, we can apply matching-moment algorithms to capture portfolio skewness or fit more sophisticated, flexible volatility models such as an EGARCH-GED (see RiskMetrics Monitor, 4th quarter, 1996).
HS: Historical skewness and kurtosis are automatically accounted for when estimating the portfolio return distribution. A large sample size is often required to get good estimates of these statistics.
VCV: Very cumbersome. Incorporating multivariate skewness and kurtosis is intractable.

Issue: Statistical properties; issues related to covariance matrices
Portfolio aggregation: No covariance matrix required. By averaging individual returns, so-called outliers are smoothed out.
HS: No covariance matrix required. By averaging individual returns, so-called outliers are smoothed out.
VCV: Covariance matrix required. The VaR calculation can be very sensitive to the numerical precision of the covariance matrix; the precision is related to the definiteness of the covariance matrix. Many individual time series tend to have outliers.

Issue: Measuring risk of non-linear positions
Portfolio aggregation: Work with either Taylor series approximations or full revaluation.
HS: Work with either Taylor series approximations or full revaluation.
VCV: Use the RiskMetrics delta-gamma approximation.

Issue: Incorporate other statistical/risk management models
Portfolio aggregation: Can try GARCH and stochastic volatility models to estimate volatility. Also, can try to model the entire pdf of the portfolio in a univariate framework. With exponential weighting, can find one optimal decay factor per portfolio.
HS: Can try nonparametric density estimation to get better estimates of percentiles.
VCV: Very difficult due to high dimensionality. In RiskMetrics, essentially confined to using one decay factor for all series.

Issue: Estimating/forecasting mean returns
Portfolio aggregation: Can forecast the mean from a time series model or regression.
HS: Can get a sample estimate of the mean over some historical period.
VCV: Need estimates of the mean on each time series.

Issue: Allow macroeconomic variables to predict portfolio risk
Portfolio aggregation: Straightforward application of a multivariate regression model.
HS: Not applicable.
VCV: Nothing done on this to date.

³ This is because the assumptions underlying HS require that historical returns are independent of one another. Therefore, if we were going to produce a one-month VaR forecast we would need to work with monthly (non-overlapping) returns. Obviously, this drastically cuts down the number of observed portfolio returns.

2. Demonstrating the portfolio aggregation principle


We now provide an example of how risk managers may compute VaR without estimating time-series-specific volatilities and correlations. To keep things simple, consider a portfolio consisting of two positions, where each position corresponds to a particular risk factor. The portfolio return at time t can be written as follows:
r_{p,t} = \omega_1 r_{1,t} + \omega_2 r_{2,t}    [1]

where ω_1 and ω_2 are the proportions of the total portfolio value allocated to each risk factor. For example, assume that a portfolio consists of two foreign exchange positions, one in Deutsche marks (DEM) and the other in Italian lira (ITL). Further, suppose that the USD equivalent values of these positions are USD 100 mm and USD 200 mm, respectively. In this case, the total value of the portfolio would be USD 300 mm and we would have ω_1 = 100/300 and ω_2 = 200/300. Also, r_{1,t} and r_{2,t} represent the returns on the USD/DEM and USD/ITL exchange rates at time t, respectively.
Now, if one were to assume that returns are distributed according to the conditional normal distribution, VaR (as a percent of the total position value) would be given by 1.65 times the standard deviation of the return on the portfolio. The portfolio standard deviation, σ_{p,t}, is a function of the variances of the underlying returns and the correlation between the returns on the two exchange rates, i.e.,

\sigma_{p,t} = \sqrt{\omega_1^2 \sigma_{1,t}^2 + \omega_2^2 \sigma_{2,t}^2 + 2 \omega_1 \omega_2 \sigma_{1,t} \sigma_{2,t} \rho_{12,t}}    [2]

where
σ_{1,t}² is the variance of r_{1,t},
σ_{2,t}² is the variance of r_{2,t}, and
ρ_{12,t} is the correlation between r_{1,t} and r_{2,t}.
We could compute the volatility of portfolio returns without having to compute the individual volatilities and correlation. Since we know the current portfolio weights (positions) and we have historical time series on each of the individual returns, r_{1,t} and r_{2,t}, we can compute a historical time series of portfolio returns using Eq. [1] for each day and then take the standard deviation of this time series. For example, suppose we want to construct a standard deviation forecast of the portfolio specified in Eq. [1] based on 250 daily returns (t = 1, ..., 250). Setting t = 1 as the most recent observation, the portfolio return series is constructed as follows:

t = 1:      r_{p,1} = \omega_1 r_{1,1} + \omega_2 r_{2,1}
t = 2:      r_{p,2} = \omega_1 r_{1,2} + \omega_2 r_{2,2}
  ...
t = 250:    r_{p,250} = \omega_1 r_{1,250} + \omega_2 r_{2,250}    [3]

It is simple to show that the standard deviation estimate based on the portfolio returns (r_{p,1}, ..., r_{p,250}) is equivalent to the standard deviation in Eq. [2]. We do so for the case where the standard deviation weighs each portfolio return equally. The estimator for the variance of a portfolio using equal weighting (across time) is given as follows⁴:

\hat{\sigma}_{p,t}^2 = \frac{1}{250} \sum_{t=1}^{250} (\omega_1 r_{1,t} + \omega_2 r_{2,t})^2    [4]

We can re-write the right-hand side of Eq. [4] as follows:

\hat{\sigma}_{p,t}^2 = \frac{1}{250} \sum_{t=1}^{250} (\omega_1 r_{1,t} + \omega_2 r_{2,t})^2
                     = \frac{1}{250} \sum_{t=1}^{250} \left( \omega_1^2 r_{1,t}^2 + \omega_2^2 r_{2,t}^2 + 2 \omega_1 \omega_2 r_{1,t} r_{2,t} \right)
                     = \omega_1^2 \frac{1}{250} \sum_{t=1}^{250} r_{1,t}^2 + \omega_2^2 \frac{1}{250} \sum_{t=1}^{250} r_{2,t}^2 + 2 \omega_1 \omega_2 \frac{1}{250} \sum_{t=1}^{250} r_{1,t} r_{2,t}
                     = \omega_1^2 \hat{\sigma}_{1,t}^2 + \omega_2^2 \hat{\sigma}_{2,t}^2 + 2 \omega_1 \omega_2 \hat{\sigma}_{1,t} \hat{\sigma}_{2,t} \hat{\rho}_{12,t}    [5]

Taking the square root of both sides of Eq. [5], we get

\hat{\sigma}_{p,t} = \sqrt{\omega_1^2 \hat{\sigma}_{1,t}^2 + \omega_2^2 \hat{\sigma}_{2,t}^2 + 2 \omega_1 \omega_2 \hat{\sigma}_{1,t} \hat{\sigma}_{2,t} \hat{\rho}_{12,t}}    [6]

⁴ Note that we are assuming that the mean of the portfolio return over one day is zero.


which is the same as the expression for volatility given by the VCV method (see Eq. [2]). Consequently, users can compute VaR directly by using a portfolio's weights and historical returns on the individual positions, rather than computing the individual volatilities and correlations for each time series.
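A quick numerical check of this equivalence is sketched below with simulated data; the return series are hypothetical, and the weights follow the DEM/ITL example above (100/300 and 200/300). Both routes use the zero-mean convention of footnote 4.

```python
# Check that the standard deviation of aggregated portfolio returns matches Eq. [2].
import numpy as np

rng = np.random.default_rng(2)
T = 250
r1 = rng.normal(0, 0.007, T)                   # simulated USD/DEM returns
r2 = 0.6 * r1 + rng.normal(0, 0.006, T)        # simulated USD/ITL returns, correlated with r1
w1, w2 = 100 / 300, 200 / 300                  # weights from the example above

# Portfolio aggregation route: aggregate first, then take the (zero-mean) std deviation.
rp = w1 * r1 + w2 * r2
sigma_direct = np.sqrt(np.mean(rp ** 2))

# VCV route: individual volatilities and the correlation, combined as in Eq. [2].
s1 = np.sqrt(np.mean(r1 ** 2))
s2 = np.sqrt(np.mean(r2 ** 2))
rho = np.mean(r1 * r2) / (s1 * s2)
sigma_vcv = np.sqrt(w1**2 * s1**2 + w2**2 * s2**2 + 2 * w1 * w2 * s1 * s2 * rho)

print(sigma_direct, sigma_vcv)                 # agree up to floating-point error
print("one-day 95% VaR (fraction of position value):", 1.65 * sigma_direct)
```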

3. Important advantages of working with portfolio returns


There are two important reasons for preferring to work directly with portfolio rather than individual returns. First, univariate models are much simpler and more tractable than multivariate models. For example, in the RiskMetrics Monitor (4th quarter, 1996) we presented a distribution to model financial returns known as the generalized error distribution (GED). The GED is quite flexible in that it allows for the so-called fat tails often observed in financial returns, and it is relatively simple to work with when only one time series is considered. Conversely, when multiple time series are considered, the estimation of the parameters of this model becomes very complicated.⁵ Therefore, the portfolio aggregation principle is very useful when risk managers prefer to measure the VaR of a portfolio while assuming that portfolio returns follow the GED.
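For instance, a univariate GED can be fitted to the portfolio return series in a few lines; the sketch below uses scipy's gennorm (a generalized normal distribution with shape, location, and scale parameters) on simulated fat-tailed returns, then reads VaR off the fitted 5th percentile. The data and parameter values are illustrative only.

```python
# Sketch: fit a generalized error distribution (GED) to portfolio returns and read
# VaR off its 5th percentile. scipy's gennorm is used as the GED; the simulated
# fat-tailed return series and its parameters are purely illustrative.
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(3)
portfolio_returns = gennorm.rvs(1.3, loc=0.0, scale=0.006, size=1000, random_state=rng)

shape, loc, scale = gennorm.fit(portfolio_returns)          # univariate MLE: three parameters
VaR_95 = -gennorm.ppf(0.05, shape, loc=loc, scale=scale)    # one-day 95% VaR, as a return (loss)

print(f"fitted shape = {shape:.2f} (values below 2 mean fatter tails than the normal)")
print(f"one-day 95% VaR = {VaR_95:.4f}")
```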
The second reason for preferring to work with portfolio returns is that the distribution of portfolio returns tends to be "well behaved," in a statistical sense, relative to the distributions of individual returns. To explain what we mean by well behaved, the following two tables provide information on 20 foreign exchange and 22 money market rates, as well as on portfolios composed of those foreign exchange and money market rates. The portfolios were constructed using equal weights.
In Table 2 we present 5 statistics for the 20 foreign exchange series. For each series, we compute daily returns and then find the optimal RiskMetrics decay factor associated with one-day VaR forecasts. Here, "optimal" is defined as the decay factor that makes the daily returns divided by the standard deviation forecasts most "normal." Once we settle on an optimal decay factor, we create a series of standardized returns, i.e., daily returns divided by their respective standard deviation forecasts.
For each foreign exchange series, if the assumption of conditional normality holds, then we should expect the skewness coefficient to be zero and the kurtosis coefficient (a measure of fat tails) to be 3. In addition, we would expect the Shapiro-Wilks normality test statistic to be 0.999 and the mean and standard deviation of standardized returns to be 0 and 1, respectively. As a benchmark, the last row of Table 2 shows these statistics for a simulated series of conditional normal returns.
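The sketch below shows one way such a decay-factor search could be implemented: each candidate decay factor produces one-day-ahead EWMA volatility forecasts, returns are standardized by those forecasts, and the factor with the highest Shapiro-Wilk statistic is retained, along with the moments reported in the tables. The simulated fat-tailed input series, the warm-up window, and the search grid are all assumptions made for illustration.

```python
# Sketch of the decay-factor search: standardize daily returns by one-day-ahead EWMA
# volatility forecasts and keep the decay factor whose standardized returns look most
# normal (highest Shapiro-Wilk statistic). The input series is simulated for illustration.
import numpy as np
from scipy.stats import shapiro, skew, kurtosis

rng = np.random.default_rng(4)
returns = rng.standard_t(df=6, size=1200) * 0.004           # fat-tailed daily returns (illustrative)

def standardized(returns, decay, warmup=50):
    """Divide each return by an EWMA volatility forecast built from prior returns only."""
    var = np.var(returns[:warmup])                           # seed the recursion on a warm-up window
    z = []
    for r in returns[warmup:]:
        z.append(r / np.sqrt(var))                           # forecast uses data up to t-1 only
        var = decay * var + (1 - decay) * r ** 2
    return np.array(z)

candidates = np.arange(0.85, 1.00, 0.005)
w_stat, best_decay = max((shapiro(standardized(returns, d))[0], d) for d in candidates)

z = standardized(returns, best_decay)
print(f"optimal decay factor: {best_decay:.3f} (Shapiro-Wilk W = {w_stat:.3f})")
print(f"skewness = {skew(z):.2f}, kurtosis = {kurtosis(z, fisher=False):.2f}, "
      f"mean = {z.mean():.3f}, std = {z.std():.3f}")
```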

⁵ Currently, we are not aware of robust estimation methods for the multivariate GED distribution.


Table 2
Foreign exchange
Based on daily returns over the period June 1991 - February 1996

Series                 Optimal decay   Skewness      Kurtosis      Normality   Mean      Standard
                       factor          coefficient   coefficient   test                  deviation
Austrian schilling     0.935           -0.1827        4.45         0.993        0.0072   1.0397
Australian dollar      0.980           -0.1332        4.46         0.990        0.0033   1.0017
Belgian franc          0.940            0.0205        5.12         0.990        0.0117   1.046
Canadian dollar        0.955           -0.0683        4.27         0.994       -0.043    1.0139
Swiss franc            0.950           -0.0936        4.47         0.993        0.0185   1.0344
German mark            0.940           -0.159         4.52         0.993        0.0036   1.0415
Danish krone           0.960           -0.2444        4.59         0.992        0.0116   1.0319
Spanish peseta         0.905           -0.3644        6.43         0.987       -0.0282   1.0828
French franc           0.945           -0.1564        4.60         0.993        0.0067   1.0411
Finnish mark           0.995           -5.3418       77.41         0.889        0.0128   1.3086
British pound          0.955           -0.159         4.76         0.991       -0.0082   1.0462
Hong Kong dollar       0.850            1.096        10.33         0.962        0.056    1.1746
Irish pound            0.990           -2.3643       32.41         0.945       -0.0028   1.1512
Italian lira           0.935           -0.4586        5.80         0.989       -0.0393   1.0691
Japanese yen           0.965            0.0664        6.14         0.980        0.0406   1.0199
Dutch guilder          0.950           -0.0627        4.60         0.992        0.0054   1.0353
Norwegian krone        0.975           -0.5388        8.89         0.980       -0.0029   1.0466
New Zealand dollar     0.995           -0.5389        7.48         0.977        0.052    1.0201
Portuguese escudo      0.925           -0.2013        6.03         0.986       -0.0066   1.0664
Swedish krona          0.985           -0.7355        9.81         0.976       -0.0158   1.0575
Portfolio              0.955           -0.2524        5.03         0.990       -0.0049   1.0382
Simulated Normal       0.990            0.0123        3.19         0.999        0.0077   1.0104

Table 2 shows that none of the time series is exactly conditionally normal, although some series, such as the Canadian dollar and the German mark, are quite close. Other series, such as the Finnish mark and the Irish pound, are highly non-normal. Notice that the portfolio consisting of the 20 equally weighted currencies is relatively close to normality (its skewness and kurtosis are -0.25 and 5.03, respectively), even though it contains very non-normal time series. This is evidence that portfolio aggregation mitigates the effect of the very non-normal time series.
A further set of statistics underscoring the profound effect that aggregation has on the distribution of portfolio returns is presented in Table 3, which shows the optimal decay factor, skewness and kurtosis coefficients, normality test, mean, and standard deviation for 22 money market rates. These rates are much more non-normal than the foreign exchange series.


Table 3
Money Market Rates
Based on daily returns over the period June 1991 - February 1996

Series                 Optimal decay   Skewness      Kurtosis      Normality   Mean      Standard
                       factor          coefficient   coefficient   test                  deviation
Austria 30 day         0.995            1.314         21.14        0.883        0.0774   0.8959
Austria 90 day         0.995            0.858          8.61        0.919        0.0774   0.8691
Austria 180 day        0.985            0.556         16.56        0.877        0.1074   1.0252
Finland 30 day         0.915            1.912         28.22        0.929        0.0689   1.2019
Finland 90 day         0.945            0.983         13.21        0.949        0.0604   1.1368
Finland 180 day        0.960            0.434          7.94        0.968        0.0522   1.0823
Finland 360 day        0.985            0.842         16.22        0.864        0.0682   1.1012
Ireland 30 day         0.905           10.558        239.79        0.810        0.1264   1.3776
Ireland 90 day         0.850            3.352         62.70        0.895        0.1034   1.2881
Ireland 180 day        0.945           -0.143         20.31        0.923        0.0771   1.1136
Ireland 360 day        0.985            3.296         62.74        0.880        0.0643   1.1983
Norway 30 day          0.905            5.175         76.87        0.887        0.0863   1.2576
Norway 90 day          0.890            8.328        165.65        0.864        0.0962   1.2942
Norway 180 day         0.890           10.247        213.88        0.844        0.1146   1.342
New Zealand 30 day     0.850           -0.334         24.14        0.932        0.0577   1.243
New Zealand 90 day     0.850            0.766         17.82        0.947        0.0637   1.2123
New Zealand 180 day    0.850            1.031         13.46        0.956        0.0693   1.1927
Portugal 30 day        0.855            3.070         44.50        0.893        0.1327   1.3389
Portugal 90 day        0.950            4.708         91.09        0.815        0.0739   1.3276
Portugal 180 day       0.970           -1.069         38.36        0.875        0.05     1.1646
Portugal 360 day       0.990           -8.996        203.72        0.804        0.0253   1.3479
US 90 day              0.975            0.152         12.22        0.873       -0.0161   1.0902
Portfolio              0.950           -1.015         22.34        0.939        0.0847   1.1316
Simulated Normal       0.995            0.076          2.83        0.999       -0.0169   0.975

The results in Table 3 show that the portfolio return distribution has the 4th highest normality test statistic and the 9th smallest kurtosis coefficient, even though some of the underlying time series are extremely non-normal (e.g., the Portugal 30 day rate).

4. Concluding remarks
Although much has been written and discussed about market risk measurement methodologies, it seems
that risk managers have yet to acknowledge the portfolio aggregation principle suggested in this note.
Nevertheless, for risk managers who seek a flexible and efficient methodology for measuring market
risk, a strong case could be made for estimating VaR by fitting a statistical model directly to the time
series of portfolio returns.


Previous editions of the RiskMetrics Monitor


4th Quarter 1996: December 19, 1996
Testing RiskMetrics volatility forecasts on emerging markets data.
When is non-normality a problem? The case of 15 time series from emerging markets.

3rd Quarter 1996: September 16, 1996


Accounting for pull to par and roll down for RiskMetrics cash flows.
How accurate is the delta-gamma methodology?
VaR for basket currencies.

2nd Quarter 1996: June 11, 1996


An improved RiskMetrics methodology to help risk managers avoid underestimating VaR.
A Value-at-Risk analysis of foreign exchange flows exposed to OECD and emerging market
currencies, most of which are not yet covered by the RiskMetrics data sets.
Estimating index tracking error for equity portfolios in the context of principal variables that
influence the process of portfolio diversification.

1st Quarter 1996: January 23, 1996

Basel Committee revises market risk supplement to 1988 Capital Accord.

A look at two methodologies that use a basic delta-gamma parametric VaR precept but achieve
results similar to simulation.

4th Quarter 1995: October 12, 1995


Exploring alternative volatility forecasting methods for the standard RiskMetrics monthly
horizon.
How accurate are the risk estimates in portfolios that contain Treasury bills proxied by LIBOR
data.
A solution to the standard cash flow mapping algorithm, which sometimes leads to imaginary
roots.

3rd Quarter 1995: July 5, 1995


Mapping and estimating VaR for interest rate swaps
Adjusting correlations obtained from nonsynchronous data.


RiskMetrics products

Introduction to RiskMetrics: An eight-page document that broadly describes the RiskMetrics methodology for measuring market risks.

RiskMetrics Directory: Available exclusively on-line, a list of consulting practices and software products that incorporate the RiskMetrics methodology and data sets.

RiskMetrics Technical Document: A manual describing the RiskMetrics methodology for estimating market risks. It specifies how financial instruments should be mapped and describes how volatilities and correlations are estimated in order to compute market risks for trading and investment horizons. The manual also describes the format of the volatility and correlation data and the sources from which daily updates can be downloaded. Available in printed form as well as Adobe pdf format.

RiskMetrics Monitor: A quarterly publication that discusses broad market risk management issues and statistical questions as well as new software products built by third-party vendors to support RiskMetrics.

RiskMetrics data sets: Two sets of daily estimates of future volatilities and correlations of approximately 480 rates and prices, with each data set totaling 115,000+ data points. One set is for computing short-term trading risks, the other for medium-term investment risks. The data sets currently cover foreign exchange, government bond, swap, and equity markets in up to 31 currencies. Eleven commodities are also included. A RiskMetrics Regulatory data set, which incorporates the latest recommendations from the Basel Committee on the use of internal models to measure market risk, is also available.

Worldwide RiskMetrics contacts

For more information about RiskMetrics, please contact the authors or any other person listed below.

North America
New York         Jacques Longerstaey (1-212) 648-4936, longerstaey_j@jpmorgan.com
Chicago          Michael Moore (1-312) 541-3511, moore_mike@jpmorgan.com
Mexico           Beatrice Sibblies (52-5) 540-9554, sibblies_beatrice@jpmorgan.com
San Francisco    Paul Schoffelen (1-415) 954-3240, schoffelen_paul@jpmorgan.com
Toronto          Dawn Desjardins (1-416) 981-9264, desjardins_dawn@jpmorgan.com

Europe
London           Guy Coughlan (44-71) 325-5384, coughlan_g@jpmorgan.com
Brussels         Laurent Fransolet (32-2) 508-8517, fransolet_l@jpmorgan.com
Paris            Ciaran O'Hagan (33-1) 4015-4058, ohagan_c@jpmorgan.com
Frankfurt        Robert Bierich (49-69) 712-4331, bierich_r@jpmorgan.com
Milan            Roberto Fumagalli (39-2) 774-4230, fumagalli_r@jpmorgan.com
Madrid           Jose Antonio Carretero (34-1) 577-1299, carretero_jl@jpmorgan.com
Zurich           Viktor Tschirky (41-1) 206-8686, tschirky_v@jpmorgan.com

Asia
Singapore        Michael Wilson (65) 326-9901, wilson_mike@jpmorgan.com
Tokyo            Yuri Nagai (81-3) 5573-1168, nagai_y@jpmorgan.com
Hong Kong        Martin Matsui (85-2) 973-5480, matsui_martin@jpmorgan.com
Australia        Debra Robertson (61-2) 551-6200, robertson_d@jpmorgan.com

RiskMetrics is based on, but differs significantly from, the market risk management systems developed by J.P. Morgan for its own use. J.P. Morgan does not warrant any results
obtained from use of the RiskMetrics data, methodology, documentation or any information derived from the data (collectively the Data) and does not guarantee its sequence,
timeliness, accuracy, completeness or continued availability. The Data is calculated on the basis of historical observations and should not be relied upon to predict future market
movements. The Data is meant to be used with systems developed by third parties. J.P. Morgan does not guarantee the accuracy or quality of such systems.
Additional information is available upon request. Information herein is believed to be reliable, but J.P. Morgan does not warrant its completeness or accuracy. Opinions and estimates constitute our judgement and are
subject to change without notice. Past performance is not indicative of future results. This material is not intended as an offer or solicitation for the purchase or sale of any financial instrument. J.P. Morgan may hold a
position or act as market maker in the financial instruments of any issuer discussed herein or act as advisor or lender to such issuer. Morgan Guaranty Trust Company is a member of FDIC and SFA. Copyright 1996 J.P.
Morgan & Co. Incorporated. Clients should contact analysts at and execute transactions through a J.P. Morgan entity in their home jurisdiction unless governing law permits otherwise.
