
Selection of Research Material relating to RiskMetrics Group CDO Manager

E: [email protected] W: www.riskmetrics.com

T: 020 7842 0260 F: 020 7842 0269


CONTENTS

1. Introductory Technical Note on the CDO Manager Software.

2. A comparison of stochastic default rate models, Christopher C. Finger. RiskMetrics Group Working Paper Number 00-02

3. On Default Correlation: A Copula Function Approach, David X. Li. RiskMetrics Group Working Paper Number 99-07

4. The Valuation of the ith-to-Default Basket Credit Derivatives, David X. Li. RiskMetrics Group Working Paper.

5. Worst Loss Analysis of BISTRO Reference Portfolio, Toru Tanaka, Sheikh Pancham, Tamunoye Alazigha, Fuji Bank. RiskMetrics Group CreditMetrics Monitor April 1999.

6. The Valuation of Basket Credit Derivatives, David X. Li. RiskMetrics Group CreditMetrics Monitor April 1999.

7. Conditional Approaches for CreditMetrics Portfolio Distributions, Christopher C. Finger. RiskMetrics Group CreditMetrics Monitor April 1999.

Product Technical Note

CDO Model - Key Features

I. Introduction
II. Issues in modelling CDO structures
III. Limitations of Present Approaches
IV. Enhanced CreditMetrics-based Methodology
V. CDO Model Flowchart
VI. Sample Results

I. Introduction

The CDO model allows you to analyse cash flow CDOs. A comprehensive Monte Carlo framework differs from the existing cash flow models used by many structurers and rating agencies, in that we generate a more complete set of scenarios instead of just the limited stress-testing scenarios set by users. The model is multi-period: the dynamics of collateralisation and asset tests are captured in this framework. Instead of approximating a correlated heterogeneous portfolio by an independent homogeneous portfolio, as some agencies do, the model takes all collateral assets into consideration.

II. Issues in modelling CDO structures

• Structures vary; it is difficult to model all variations.

• No liquid secondary market.

• Current evaluation is based on agency rating
  o at origination, plus
  o rating downgrade surprises.

• The few available tools lack a consistent portfolio-based credit methodology.

• Need for an independent risk assessment tool, with a well-defined credit methodology, which incorporates a stochastic process.

III. Limitations of Present Approaches

• Most assume a constant global annual default rate for assets in the collateral pool, e.g. a fixed percentage of assets defaults in the 1st year, another in the 2nd year, and so on.
  o Simple base case, but not realistic, as the timing of losses in a CDO is extremely important.
  o Treats collateral as a homogeneous pool of assets. In reality, we may have many issues, varying features, and complex correlation structures which are strongly non-homogeneous.

(Limitations of Present Approaches, continued…)

• Limited to front-loaded default rate analysis, i.e. most defaults in the first few years.

• Most simulate default rates only.
  o Does not capture asset-specific default.
  o Deficient in analysing Mezzanine notes and Equity investments.

IV. Enhanced CreditMetrics-based Methodology

• Treat default risk at the Obligor or Asset level. This takes into account obligor-specific risk, industry or sector risk, and correlations between assets, using the CreditMetrics methodology.

• Build a scenario of "default time" for each asset, thereby affecting the timing of cash flows received and delays in recovery.

• Generate scenarios of default times for each asset in the collateral pool using a Monte Carlo simulation process. Front-loaded default analysis would be accounted for in some of the scenarios.

• Results analysis can focus on adverse scenarios with the worst simulated risk-return.

• Aggregate the resulting cash flows and allocate them over the distribution structure, computing results at each coupon period and over the life of the deal. Performance is thus path-dependent on the timing and severity of losses.

• Can model and analyse cash flows under the assumption of no manager intervention. This provides an expected and worst case against which to assess managers' performance.

• Combining the CreditMetrics approach with copula functions allows us to model default over multiple periods, extending the one-period framework used in CreditMetrics (a minimal simulation sketch follows this list).

V. CDO Model Flowchart

[Flowchart not reproduced in this transcript.]

VI. Sample Results

Sample results for the worst-case scenarios from simulations on the senior tranche of a generic structure:

[Four charts: Yield vs Collateral Loss (yields near 4.73-4.74%); Duration vs Collateral Loss (durations near 2.86-2.96); Yield vs Average Life (average lives near 3.08-3.20); Average Life vs Collateral Loss. Collateral losses run from 0 to roughly 3 (in thousands).]

The RiskMetrics Group Working Paper Number 00-02

A comparison of stochastic default rate models

Christopher C. Finger

This draft: August 2000
First draft: July 2000

44 Wall St., New York, NY 10005
[email protected]
www.riskmetrics.com


A comparison of stochastic default rate models

Christopher C. Finger

August 2000

Abstract

For single horizon models of defaults in a portfolio, the effect of model and distribution choice on the model results is well understood. Collateralized Debt Obligations in particular have sparked interest in default models over multiple horizons. For these, however, there has been little research, and there is little understanding of the impact of various model assumptions. In this article, we investigate four approaches to multiple horizon modeling of defaults in a portfolio. We calibrate the four models to the same set of input data (average defaults and a single period correlation parameter), and examine the resulting default distributions. The differences we observe can be attributed to the model structures, and to some extent, to the choice of distributions that drive the models. Our results show a significant disparity. In the single period case, studies have concluded that when calibrated to the same first and second order information, the various models do not produce vastly different conclusions. Here, the issue of model choice is much more important, and any analysis of structures over multiple horizons should bear this in mind.

Keywords: Credit risk, default rate, collateralized debt obligations


1 Introduction

In recent years, models of defaults in a portfolio context have been well studied. Three separate

approaches (CreditMetrics, CreditRisk+, and CreditPortfolioView¹) were made public in 1997.

Subsequently, researchers² have examined the mathematical structure of the various models. Each

of these studies has revealed that it is possible to calibrate the models to each other and that the

differences between the models lie in subtle choices of the driving distributions and in the data

sources one would naturally use to feed the models.

Common to all of these models, and to the subsequent examinations thereof, is the fact that the

models describe only a single period. In other words, the models describe, for a specific risk horizon,

whether each asset of interest defaults within the horizon. The timing of defaults within the risk

horizon is not considered, nor is the possibility of defaults beyond the horizon. This is not a flaw

of the current models, but rather an indication of their genesis as approaches to risk management

and capital allocation for a fixed portfolio.

Not entirely by chance, the development of portfolio models for credit risk management has coincided with an explosion in issuance of Collateralized Debt Obligations (CDOs). The performance

of a CDO structure depends on the default behavior of a pool of assets. Significantly, the dependence is not just on whether the assets default over the life of the structure, but also on when the

defaults occur. Thus, while an application of the existing models can give a cursory view of the

structure (by describing, for instance, the distribution of the number of assets that will default over

the structure’s life), a more rigorous analysis requires a model of the timing of defaults.

In this paper, we will survey a number of extensions of the standard single-period models that allow

for a treatment of default timing over longer horizons. We will examine two extensions of the CreditMetrics approach, one that models only defaults over time and a second that effectively accounts

¹See Wilson (1997).

²See Finger (1998), Gordy (2000), and Kolyoglu and Hickman (1998).


for rating migrations. In addition, we will examine the copula function approach introduced by Li

(1999 and 2000), as well as a simple version of the stochastic intensity model applied by Duffie

and Garleanu (1998).

We will seek to investigate the differences in the four approaches that arise from model – rather than

data – differences. Thus, we will suppose that we begin with satisfactory estimates of expected

default rates over time, and of the correlation of default events over one period. Higher order

information, such as the correlation of defaults in subsequent periods or the joint behavior of three

or more assets, will be driven by the structure of the models. The analysis of the models will

then illuminate the range of results that can arise given the same initial data. Nagpal and Bahar

(1999) adopt a similar approach in the single horizon context, investigating the range of possible

full distributions that can be calibrated to first and second order default statistics.

In the following section, we present terminology and notation to be used throughout. We proceed to

detail the four models. Finally, we present two comparison exercises: in the first, we use closed form

results to analyze default rate volatilities and conditional default probabilities, while in the second,

we implement Monte Carlo simulations in order to investigate the full distribution of realized default

rates.

2 Notation and terminology

In order to compare the properties of the four models, we will consider a large homogeneous pool

of assets. By homogeneous, we mean that each asset has the same probability of default (first order

statistics) at every time we consider; further, each pair of assets has the same joint probability of

default (second order statistics) at every time.

To describe the first order statistics of the pool, we specify the cumulative default probability $q_k$ – the probability that a given asset defaults in the next $k$ years – for $k = 1, 2, \ldots, T$, where $T$ is the maximum horizon we consider. Equivalently, we may specify the marginal default probability


$p_k$ – the probability that a given asset defaults in year $k$. Clearly, cumulative and marginal default probabilities are related through

$q_k = q_{k-1} + p_k, \quad \text{for } k = 2, \ldots, T.$  (1)

It is important to distinguish a third equivalent specification, that of conditional default probabilities. The conditional default probability in year $k$ is defined as the conditional probability that an asset defaults in year $k$, given that the asset has survived (that is, has not defaulted) in the first $k-1$ years. This probability is given by $p_k/(1 - q_{k-1})$.
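As a concrete illustration (using the Moody's speculative grade figures that appear later in Table 2), the three equivalent specifications convert into one another in a few lines; a minimal Python sketch:

```python
import numpy as np

# Cumulative default probabilities q_1..q_6 (Table 2, speculative grade)
q = np.array([0.0335, 0.0676, 0.0998, 0.1289, 0.1557, 0.1791])

p = np.diff(q, prepend=0.0)                     # marginal p_k, eq. (1)
survival = 1 - np.concatenate(([0.0], q[:-1]))  # 1 - q_{k-1}
conditional = p / survival                      # p_k / (1 - q_{k-1})
```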

Finally, to describe the second order statistics of the pool, we specify the joint cumulative default probability $q_{j,k}$ – the probability that for a given pair of assets, the first asset defaults sometime in the first $j$ years and the second defaults sometime in the first $k$ years – or equivalently, the joint marginal default probability $p_{j,k}$ – the probability that the first asset defaults in year $j$ and the second defaults in year $k$. These two notions are related through

$q_{j,k} = q_{j-1,k-1} + \sum_{i=1}^{j-1} p_{i,k} + \sum_{i=1}^{k-1} p_{j,i} + p_{j,k}, \quad \text{for } j, k = 2, \ldots, T.$  (2)

In practice, it is possible to obtain first order statistics for relatively long horizons, either by observing

market prices of risky debt and calibrating cumulative default probabilities as in Duffie and Singleton

(1999), or by taking historical cumulative default experience from a study such as Keenan et al (2000)

or Standard & Poor’s (2000). Less information is available for second order statistics, however, and

therefore we will assume that we can obtain the joint default probability for the first year ($p_{1,1}$),³

but not any of the joint default probabilities for subsequent years. Thus, our exercise will be to

calibrate each of the four models to fixed values of $q_1, q_2, \ldots, q_T$ and $p_{1,1}$, and then to compare the

higher order statistics implied by the models.

The model comparison can be a simple task of comparing values of $p_{1,2}$, $p_{2,2}$, $q_{2,2}$, and so on.

However, to make the comparisons a bit more tangible, we will consider the distributions of realized

³This is a reasonable supposition, since all of the single period models mentioned previously essentially require $p_{1,1}$ as an input.


default rates. The term "default rate" is often used loosely in the literature, without a clear notion

of whether default rate is synonymous with default probability, or rather is itself a random variable.

To be clear, in this article, default rate is a random variable equal to the proportion of assets in

a portfolio that default. For instance, if the random variable $X^{(k)}_i$ is equal to one if the $i$th asset defaults in year $k$, then the year $k$ default rate is equal to

$\frac{1}{n} \sum_{i=1}^{n} X^{(k)}_i.$  (3)

For our homogeneous portfolio, the mean year $k$ default rate is simply $p_k$, the marginal default probability for year $k$. Furthermore, the standard deviation of the year $k$ default rate (which we will refer to as the year $k$ default rate volatility) is

$\sqrt{p_{k,k} - p_k^2 + (p_k - p_{k,k})/n}.$  (4)

Of interest to us is the large portfolio limit (that is, $n \to \infty$) of this quantity, normalized by the default probability. We will refer to this as the normalized year $k$ default volatility, which is given by

$\frac{\sqrt{p_{k,k} - p_k^2}}{p_k}.$  (5)

Additionally, we will examine the normalized cumulative year $k$ default volatility, which is defined similarly to the above, with the exception that the default rate is computed over the first $k$ years rather than year $k$ only. The normalized cumulative default volatility is given by

$\frac{\sqrt{q_{k,k} - q_k^2}}{q_k}.$  (6)
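Given the joint probabilities $p_{k,k}$ and $q_{k,k}$ implied by any of the models below, (5) and (6) are immediate to evaluate; a minimal sketch:

```python
import numpy as np

def normalized_marginal_vol(p_k, p_kk):
    """Eq. (5): normalized year-k default volatility, large-n limit."""
    return np.sqrt(p_kk - p_k ** 2) / p_k

def normalized_cumulative_vol(q_k, q_kk):
    """Eq. (6): normalized cumulative year-k default volatility."""
    return np.sqrt(q_kk - q_k ** 2) / q_k
```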

Finally, we will use $\Phi$ to denote the standard normal cumulative distribution function. In the bivariate setting, we will use $\Phi_2(z_1, z_2; \rho)$ to indicate the probability that $Z_1 < z_1$ and $Z_2 < z_2$, where $Z_1$ and $Z_2$ are standard normal random variables with correlation $\rho$.

In the following four sections, we describe the models to be considered, and discuss in detail the

calibration to our initial data.


3 Discrete CreditMetrics extension

In its simplest form, the single period CreditMetrics model, calibrated for our homogeneous portfolio, can be stated as follows:

(i) Define a default threshold $\alpha$ such that $\Phi(\alpha) = p_1$.

(ii) To each asset $i$, assign a standard normal random variable $Z^{(i)}$, where the correlation between distinct $Z^{(i)}$ and $Z^{(j)}$ is equal to $\rho$, such that

$\Phi_2(\alpha, \alpha; \rho) = p_{1,1}.$  (7)

(iii) Asset $i$ defaults in year 1 if $Z^{(i)} < \alpha$.
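Equation (7) defines $\rho$ only implicitly; a standard way to recover it is to root-find on the bivariate normal CDF. A sketch (the inputs are the investment grade figures from Table 1; the bracketing interval is an assumption):

```python
from scipy.optimize import brentq
from scipy.stats import multivariate_normal, norm

p1, p11 = 0.0016, 0.000007      # p_1 and p_{1,1}, investment grade, low corr
alpha = norm.ppf(p1)            # default threshold, Phi(alpha) = p_1

def joint_default_prob(rho):
    """Phi_2(alpha, alpha; rho), the left side of (7)."""
    mvn = multivariate_normal(mean=[0.0, 0.0],
                              cov=[[1.0, rho], [rho, 1.0]])
    return mvn.cdf([alpha, alpha])

rho = brentq(lambda r: joint_default_prob(r) - p11, 0.0, 0.99)
```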

The simplest extension of this model to multiple horizons is to simply repeat the one period model.

We then have default thresholds $\alpha_1, \alpha_2, \ldots, \alpha_T$ corresponding to each period. For the first period, we assign standard normal random variables $Z^{(i)}_1$ to each asset as above, and asset $i$ defaults in the first period if $Z^{(i)}_1 < \alpha_1$. For assets that survive the first period, we assign a second set of standard normal random variables $Z^{(i)}_2$, such that the correlation between distinct $Z^{(i)}_2$ and $Z^{(j)}_2$ is $\rho$ but the variables from one period to the next are independent. Asset $i$ then defaults in the second period if $Z^{(i)}_1 > \alpha_1$ (it survives the first period) and $Z^{(i)}_2 < \alpha_2$. The extension to subsequent periods should be clear. In the end, the model is specified by the default thresholds $\alpha_1, \alpha_2, \ldots, \alpha_T$ and the correlation parameter $\rho$.

To calibrate this model to our cumulative default probabilities $q_1, q_2, \ldots, q_T$ and joint default probability, we begin by setting the first period default threshold:

$\alpha_1 = \Phi^{-1}(q_1).$  (8)

For subsequent periods, we set $\alpha_k$ such that the probability that $Z^{(i)}_k < \alpha_k$ is equal to the conditional default probability for period $k$:

$\alpha_k = \Phi^{-1}\!\left( \frac{q_k - q_{k-1}}{1 - q_{k-1}} \right).$  (9)


We complete the calibration by choosing $\rho$ to satisfy (7), with $\alpha$ replaced by $\alpha_1$.

The joint default probabilities and default volatilities are easily obtained in this context. For instance,

the marginal year two joint default probability is given by (for distinct $i$ and $j$):

$p_{2,2} = P\{Z^{(i)}_1 > \alpha_1 \cap Z^{(j)}_1 > \alpha_1 \cap Z^{(i)}_2 < \alpha_2 \cap Z^{(j)}_2 < \alpha_2\}$
$\quad\;\; = P\{Z^{(i)}_1 > \alpha_1 \cap Z^{(j)}_1 > \alpha_1\} \cdot P\{Z^{(i)}_2 < \alpha_2 \cap Z^{(j)}_2 < \alpha_2\}$
$\quad\;\; = (1 - 2p_1 + p_{1,1}) \cdot \Phi_2(\alpha_2, \alpha_2; \rho).$  (10)

Similarly, the probability that asset $i$ defaults in the first period, and asset $j$ in the second period, is

$p_{1,2} = P\{Z^{(i)}_1 < \alpha_1 \cap Z^{(j)}_1 > \alpha_1 \cap Z^{(j)}_2 < \alpha_2\} = (p_1 - p_{1,1}) \cdot \frac{q_2 - p_1}{1 - p_1}.$  (11)

It is then possible to obtain $q_{2,2}$ using (2) and the default volatilities using (5) and (6).
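A minimal Monte Carlo sketch of this discrete extension (the one-factor representation of the equicorrelated normals and the portfolio and trial sizes are implementation choices, not prescribed by the paper):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def simulate_discrete_cm(q, rho, n_assets=100, n_trials=1000):
    """Defaults per year under the discrete CreditMetrics extension."""
    q = np.asarray(q)
    cond = np.diff(q, prepend=0.0) / (1 - np.concatenate(([0.0], q[:-1])))
    alphas = norm.ppf(cond)                        # eqs. (8)-(9)
    alive = np.ones((n_trials, n_assets), dtype=bool)
    defaults = np.zeros((n_trials, len(q)), dtype=int)
    for k, a_k in enumerate(alphas):
        # Fresh, mutually correlated drivers each period (independence
        # across periods is the defining feature of this extension).
        z = (np.sqrt(rho) * rng.standard_normal((n_trials, 1))
             + np.sqrt(1 - rho) * rng.standard_normal((n_trials, n_assets)))
        newly = alive & (z < a_k)
        defaults[:, k] = newly.sum(axis=1)
        alive &= ~newly
    return defaults
```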

4 Diffusion-driven CreditMetrics extension

By construction, the discrete CreditMetrics extension above does not allow for any correlation of

default rates through time. For instance, if a high default rate is realized in the first period, this has

no bearing on the default rate in the second period, since the default drivers for the second period

(the $Z^{(i)}_2$ above) are independent of the default drivers for the first. Intuitively, we would not expect

this behavior from the market. If a high default rate occurs in one period, then it is likely that those

obligors that did not default would have generally decreased in credit quality. The impact would

then be that the default rate for the second period would also have a tendency to be high.

In order to capture this behavior, we introduce a CreditMetrics extension where defaults in consecutive periods are not driven by independent random variables, but rather by a single diffusion

process. Our diffusion-driven CreditMetrics extension is described by:

(i) Define default thresholds $\alpha_1, \alpha_2, \ldots, \alpha_T$ for each period.


(ii) To each obligor, assign a standard Wiener process $W^{(i)}$, with $W^{(i)}_0 = 0$, where the instantaneous correlation between distinct $W^{(i)}$ and $W^{(j)}$ is $\rho$.⁴

(iii) Obligor $i$ defaults in the first year if $W^{(i)}_1 < \alpha_1$.

(iv) For $k > 1$, obligor $i$ defaults in year $k$ if it survives the first $k-1$ years (that is, $W^{(i)}_1 > \alpha_1, \ldots, W^{(i)}_{k-1} > \alpha_{k-1}$) and $W^{(i)}_k < \alpha_k$.

Note that this approach allows for the behavior mentioned above. If the default rate is high in the

first year, this is because many of the Wiener processes have fallen below the threshold $\alpha_1$. The

Wiener processes for non-defaulting obligors will have generally trended downward as well, since

all of the Wiener processes are correlated. This implies a greater likelihood of a high number of

defaults in the second year. In effect, then, this approach introduces a notion of credit migration.

Cases where the Wiener process trends downward but does not cross the default threshold can be

thought of as downgrades, while cases where the process trends upward are essentially upgrades.

To calibrate the first threshold $\alpha_1$, we observe that

$P\{W^{(i)}_1 < \alpha_1\} = \Phi(\alpha_1),$  (12)

and thus that $\alpha_1$ is given by (8). For the second threshold, we require that the probability that an obligor defaults in year two is equal to $p_2$:

$P\{W^{(i)}_1 > \alpha_1 \cap W^{(i)}_2 < \alpha_2\} = p_2.$  (13)

Since $W^{(i)}$ is a Wiener process, we know that the standard deviation of $W^{(i)}_t$ is $\sqrt{t}$ and that for $s < t$, the correlation between $W^{(i)}_s$ and $W^{(i)}_t$ is $\sqrt{s/t}$. Thus, given $\alpha_1$, we find the value of $\alpha_2$ that satisfies

$\Phi(\alpha_2/\sqrt{2}) - \Phi_2(\alpha_1, \alpha_2/\sqrt{2}; \sqrt{1/2}) = p_2.$  (14)

⁴Technically, the cross variation process for $W^{(i)}$ and $W^{(j)}$ is $\rho\,dt$.


For the $k$th period, given $\alpha_1, \ldots, \alpha_{k-1}$, we calibrate $\alpha_k$ by solving

$P\{W^{(i)}_1 > \alpha_1 \cap \ldots \cap W^{(i)}_{k-1} > \alpha_{k-1} \cap W^{(i)}_k < \alpha_k\} = p_k,$  (15)

again utilizing the properties of the Wiener process $W^{(i)}$ to compute the probability on the left hand side.

We complete the calibration by finding $\rho$ such that the year one joint default probability is $p_{1,1}$:

$P\{W^{(i)}_1 < \alpha_1 \cap W^{(j)}_1 < \alpha_1\} = p_{1,1}.$  (16)

Since $W^{(i)}_1$ and $W^{(j)}_1$ each follow a standard normal distribution, and have a correlation of $\rho$, the solution for $\rho$ here is identical to that of the previous section.

With the calibration complete, it is a simple task to compute the joint default probabilities. For

instance, the joint year two default probability is given by

$p_{2,2} = P\{W^{(i)}_1 > \alpha_1 \cap W^{(j)}_1 > \alpha_1 \cap W^{(i)}_2 < \alpha_2 \cap W^{(j)}_2 < \alpha_2\},$  (17)

where we use the fact that $\{W^{(i)}_1, W^{(j)}_1, W^{(i)}_2, W^{(j)}_2\}$ follow a multivariate normal distribution with covariance

$\mathrm{Cov}\{W^{(i)}_1, W^{(j)}_1, W^{(i)}_2, W^{(j)}_2\} = \begin{pmatrix} 1 & \rho & 1 & \rho \\ \rho & 1 & \rho & 1 \\ 1 & \rho & 2 & 2\rho \\ \rho & 1 & 2\rho & 2 \end{pmatrix}.$  (18)
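Because annual increments of the correlated Wiener processes are themselves correlated standard normals, simulation is a cumulative-sum variant of the previous sketch. In the sketch below the thresholds are taken as inputs, since solving (14)-(15) requires multivariate normal probabilities; the one-factor factorization is again an implementation choice:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_diffusion_cm(alphas, rho, n_assets=100, n_trials=1000):
    """Defaults per year under the diffusion-driven extension."""
    w = np.zeros((n_trials, n_assets))             # W_0 = 0
    alive = np.ones((n_trials, n_assets), dtype=bool)
    defaults = np.zeros((n_trials, len(alphas)), dtype=int)
    for k, a_k in enumerate(alphas):
        dz = (np.sqrt(rho) * rng.standard_normal((n_trials, 1))
              + np.sqrt(1 - rho) * rng.standard_normal((n_trials, n_assets)))
        w += dz                  # annual increment of each W^(i)
        newly = alive & (w < a_k)
        defaults[:, k] = newly.sum(axis=1)
        alive &= ~newly
    return defaults
```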

5 Copula functions

A drawback of both the CreditMetrics extensions above is that in a Monte Carlo setting, they require

a stepwise simulation approach. In other words, we must simulate the pool of assets over the first

year, tabulate the ones that default, then simulate the remaining assets over the second year, and so

on. Li (1999 and 2000) introduces an approach wherein it is possible to simulate the default times

directly, thus avoiding the need to simulate each period individually.

The normal copula function approach is as follows:


(i) Specify the cumulative default time distribution $F$, such that $F(t)$ gives the probability that a given asset defaults prior to time $t$.

(ii) Assign a standard normal random variable $Z^{(i)}$ to each asset, where the correlation between distinct $Z^{(i)}$ and $Z^{(j)}$ is $\rho$.

(iii) Obtain the default time $\tau_i$ for asset $i$ through

$\tau_i = F^{-1}(\Phi(Z^{(i)})).$  (19)

Since we are concerned here only with the year in which an asset defaults, and not the precise

timing within the year, we will consider a discrete version of the copula approach:

(i) Specify the cumulative default probabilities $q_1, q_2, \ldots, q_T$ as in Section 2.

(ii) For $k = 1, \ldots, T$ compute the threshold $\alpha_k = \Phi^{-1}(q_k)$. Clearly, $\alpha_1 \le \alpha_2 \le \ldots \le \alpha_T$. Define $\alpha_0 = -\infty$.

(iii) Assign $Z^{(i)}$ to each asset as above.

(iv) Asset $i$ defaults in year $k$ if $\alpha_{k-1} < Z^{(i)} \le \alpha_k$.

The calibration to the cumulative default probabilities is already given. Further, it is easy to observe⁵ that the correlation parameter $\rho$ is calibrated exactly as in the previous two sections.

The joint default probabilities are perhaps simplest to obtain for this approach. For example, the

joint cumulative default probability $q_{k,l}$ is given by

$q_{k,l} = P\{Z^{(i)} < \alpha_k \cap Z^{(j)} < \alpha_l\} = \Phi_2(\alpha_k, \alpha_l; \rho).$  (20)

⁵Details are presented in Li (1999) and Li (2000).
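The practical appeal of (19)-(20) is that defaults can be simulated in a single pass; a sketch of the discrete version (the one-factor factorization of the equicorrelated $Z^{(i)}$ is an implementation choice):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def copula_default_years(q, rho, n_assets=100, n_trials=1000):
    """Default year per asset under the discrete normal copula.

    Returns 0-based indices: k means default in year k+1, and a value
    of len(q) means survival beyond the final horizon T.
    """
    alphas = norm.ppf(np.asarray(q))   # alpha_k = Phi^{-1}(q_k), increasing
    z = (np.sqrt(rho) * rng.standard_normal((n_trials, 1))
         + np.sqrt(1 - rho) * rng.standard_normal((n_trials, n_assets)))
    # searchsorted finds the first k with z <= alpha_k, i.e. the year
    # with alpha_{k-1} < Z <= alpha_k
    return np.searchsorted(alphas, z)
```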


6 Stochastic default intensity

6.1 Description of the model

The approaches of the three previous sections can all be thought of as extensions of the single

period CreditMetrics framework. Each approach relies on standard normal random variables to

drive defaults, and calibrates thresholds for these variables. Furthermore, it is easy to see that over

the first period, the three approaches are identical; they only differ in their behavior over multiple

periods.

Our fourth model takes a different approach to the construction of correlated defaults over time, and

can be thought of as an extension of the single period CreditRisk+ framework. In the CreditRisk+

model, correlations between default events are constructed through the assets’ dependence on a

common default probability, which itself is a random variable.⁶ Importantly, given the realization

of the default probability, defaults are conditionally independent. The volatility of the common

default probability is in effect the correlation parameter for this model; a higher default volatility

induces stronger correlations, while a zero volatility produces independent defaults.⁷

The natural extension of the CreditRisk+ framework to continuous time is the stochastic intensity

approach presented in Duffie and Garleanu (1998) and Duffie and Singleton (1999). Intuitively, the

stochastic intensity model stipulates that in a given small time interval, assets default independently,

with probability proportional to a common default intensity.⁸ In the next time interval, the intensity

changes, and defaults are once again independent, but with the default probability proportional to

the new intensity level. The evolution of the intensity is described through a stochastic process. In

practice, since the intensity must remain positive, it is common to apply similar stochastic processes

as are utilized in models of interest rates.

⁶More precisely, assets may depend on different default probabilities, each of which are correlated.

⁷See Finger (1998), Gordy (2000), and Kolyoglu and Hickman (1998) for further discussion.

⁸As with our description of the CreditRisk+ model, this is a simplification. The Duffie-Garleanu framework provides for an intensity process for each asset, with the processes being correlated.


For our purposes, we will model a single intensity process $h$. Conditional on $h$, the default time for each asset is then the first arrival of a Poisson process with arrival rate given by $h$. The Poisson processes driving the defaults for distinct assets are independent, meaning that given a realization of the intensity process $h$, defaults are independent. The Poisson process framework implies that given $h$, the probability that a given asset survives until time $t$ is

$\exp\left[ -\int_0^t du\, h_u \right].$  (21)

Further, because defaults are conditionally independent, the conditional probability, given $h$, that two assets both survive until time $t$ is

$\exp\left[ -2\int_0^t du\, h_u \right].$  (22)

The unconditional survival probabilities are given by expectations over the process $h$, so that in particular, the survival probability for a single asset is given by

$1 - q_t = E \exp\left[ -\int_0^t du\, h_u \right].$  (23)

For the intensity process, we assume that $h$ evolves according to the stochastic differential equation

$dh_t = -\kappa(h_t - \bar{h}_k)\,dt + \sigma\sqrt{h_t}\,dW_t,$  (24)

where $W$ is a Wiener process and $\bar{h}_k$ is the level to which the process trends during year $k$. (That is, the mean reversion is toward $\bar{h}_1$ for $t < 1$, toward $\bar{h}_2$ for $1 \le t < 2$, etc.) Let $h_0 = \bar{h}_1$. Note

that this is essentially the model for the instantaneous discount rate used in the Cox-Ingersoll-Ross

interest rate model. Note also that in Duffie-Garleanu, there is a jump component to the evolution

of h, while the level of mean reversion is constant.

In order to express the default probabilities implied by the stochastic intensity model in closed

form, we will rely on the following result from Duffie-Garleanu.⁹ For a process $h$ with $h_0 = \bar{h}$ and

⁹We have changed the notation slightly from the Duffie-Garleanu result, in order to make more explicit the dependence on $\bar{h}$.


evolving according to (24) with $\bar{h}_k = \bar{h}$ for all $k$, we have

$E_t\, \exp\left[ -\int_t^{t+s} du\, h_u \right] \exp[x + y h_{t+s}] = \exp\left[ x + \alpha_s(y)\bar{h} + \beta_s(y) h_t \right],$  (25)

where $E_t$ denotes conditional expectation given information available at time $t$. The functions $\alpha_s$ and $\beta_s$ are given by

$\alpha_s(y) = \frac{\kappa}{c}\, s + \frac{\kappa (a(y) c - d(y))}{b c d(y)} \log\left[ \frac{c + d(y) e^{bs}}{c + d(y)} \right],$ and  (26)

$\beta_s(y) = \frac{1 + a(y) e^{bs}}{c + d(y) e^{bs}},$  (27)

where

$c = \frac{-\kappa + \sqrt{\kappa^2 + 2\sigma^2}}{2},$  (28)

$d(y) = (1 - cy)\, \frac{\sigma^2 y - \kappa + \sqrt{(\sigma^2 y - \kappa)^2 - \sigma^2 (\sigma^2 y^2 - 2\kappa y - 2)}}{\sigma^2 y^2 - 2\kappa y - 2},$  (29)

$a(y) = (d(y) + c)\, y - 1,$  (30)

$b = \frac{-d(y)(\kappa + 2c) + a(y)(\sigma^2 - \kappa c)}{a(y) c - d(y)}.$  (31)

6.2 Calibration

Our calibration approach for this model will be to fix the mean reversion speed $\kappa$, solve for $\bar{h}_1$ and $\sigma$ to match $p_1$ and $p_{1,1}$, and then to solve in turn for $\bar{h}_2, \ldots, \bar{h}_T$ to match $p_2, \ldots, p_T$. To begin, we apply (23) and (25) to obtain

$p_1 = 1 - \exp\left[ \alpha_1(0)\bar{h}_1 + \beta_1(0) h_0 \right] = 1 - \exp\left[ [\alpha_1(0) + \beta_1(0)]\bar{h}_1 \right].$  (32)

To compute the joint probability that two obligors each survive the first year, we must take the

expectation of (22), which is essentially the same computation as above, but with the process $h$ replaced by $2h$. We observe that the process $2h$ also evolves according to (24) with the same mean reversion speed $\kappa$, and with $\bar{h}_k$ replaced by $2\bar{h}_k$ and $\sigma$ replaced by $\sigma\sqrt{2}$. Thus, we define the


functions $\hat{\alpha}_s$ and $\hat{\beta}_s$ in the same way as $\alpha_s$ and $\beta_s$, with $\sigma$ replaced by $\sigma\sqrt{2}$. We can then compute the joint one year survival probability:

$E \exp\left[ -2\int_0^1 du\, h_u \right] = \exp\left[ 2[\hat{\alpha}_1(0) + \hat{\beta}_1(0)]\bar{h}_1 \right].$  (33)

Finally, since the joint survival probability is equal to $1 - 2p_1 + p_{1,1}$, we have

$p_{1,1} = 2p_1 - 1 + \exp\left[ 2[\hat{\alpha}_1(0) + \hat{\beta}_1(0)]\bar{h}_1 \right].$  (34)

To calibrate $\sigma$ and $\bar{h}_1$ to (32) and (34), we first find the value of $\sigma$ such that

$\frac{2(\hat{\alpha}_1(0) + \hat{\beta}_1(0))}{\alpha_1(0) + \beta_1(0)} = \frac{\log[1 - 2p_1 + p_{1,1}]}{\log[1 - p_1]},$  (35)

and then set

$\bar{h}_1 = \frac{\log[1 - p_1]}{\alpha_1(0) + \beta_1(0)}.$  (36)

Note that though the equations are lengthy, the calibration is actually quite straightforward, in that

we are only ever required to fit one parameter at a time.
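A sketch of that one-parameter-at-a-time loop for the first year, assuming hypothetical helpers alpha1(y, kappa, sigma) and beta1(y, kappa, sigma) that evaluate (26)-(27) at $s = 1$ (the helpers are not shown here, and the bracketing interval for sigma is an assumption):

```python
import numpy as np
from scipy.optimize import brentq

def calibrate_first_year(p1, p11, kappa, alpha1, beta1):
    """Solve (35) for sigma, then set h1_bar via (36)."""
    target = np.log(1 - 2 * p1 + p11) / np.log(1 - p1)

    def mismatch(sigma):
        # alpha-hat / beta-hat are alpha / beta with sigma -> sigma*sqrt(2)
        s2 = sigma * np.sqrt(2)
        hat = 2 * (alpha1(0.0, kappa, s2) + beta1(0.0, kappa, s2))
        base = alpha1(0.0, kappa, sigma) + beta1(0.0, kappa, sigma)
        return hat / base - target

    sigma = brentq(mismatch, 1e-4, 5.0)   # assumed bracket
    h1_bar = np.log(1 - p1) / (alpha1(0.0, kappa, sigma)
                               + beta1(0.0, kappa, sigma))
    return sigma, h1_bar
```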

In order to calibrate $\bar{h}_2$, we need to obtain an expression for the two year cumulative default probability $q_2$. To this end, we must compute the two year survival probability

$1 - q_2 = E \exp\left[ -\int_0^2 du\, h_u \right].$  (37)

Since the process $h$ does not have a constant level of mean reversion over the first two years, we cannot apply (25) directly here. However, (25) can be applied once we express the two year survival probability as

$1 - q_2 = E \exp\left[ -\int_0^1 du\, h_u \right] E_1 \exp\left[ -\int_1^2 du\, h_u \right].$  (38)

Now given $h_1$, the process $h$ evolves according to (24) from $t = 1$ to $t = 2$ with a constant mean reversion level $\bar{h}_2$, meaning we can apply (25) to the conditional expectation in (38), yielding

$1 - q_2 = E \exp\left[ -\int_0^1 du\, h_u \right] \exp\left[ \alpha_1(0)\bar{h}_2 + \beta_1(0) h_1 \right].$  (39)


The same argument allows us to apply (25) again to (39), giving

$1 - q_2 = \exp\left[ \alpha_1(0)\bar{h}_2 + [\alpha_1(\beta_1(0)) + \beta_1(\beta_1(0))]\bar{h}_1 \right].$  (40)

Thus, our calibration for the second year requires setting

$\bar{h}_2 = \frac{1}{\alpha_1(0)} \left\{ \log[1 - q_2] - [\alpha_1(\beta_1(0)) + \beta_1(\beta_1(0))]\bar{h}_1 \right\}.$  (41)

The remaining mean reversion levels $\bar{h}_3, \ldots, \bar{h}_T$ are calibrated similarly.

6.3 Joint default probabilities

The computation of joint probabilities for longer horizons is similar to (34). The joint probability

that two obligors each survive the first two years is given by

$E \exp\left[ -2\int_0^2 du\, h_u \right].$  (42)

Here, we apply the same arguments as in (38) through (40) to derive

$E \exp\left[ -2\int_0^2 du\, h_u \right] = \exp\left[ 2\hat{\alpha}_1(0)\bar{h}_2 + 2[\hat{\alpha}_1(\hat{\beta}_1(0)) + \hat{\beta}_1(\hat{\beta}_1(0))]\bar{h}_1 \right].$  (43)

For the joint probability that the first obligor survives the first year and the second survives the first

two years, we must compute

$E \exp\left[ -\int_0^1 du\, h_u \right] \exp\left[ -\int_0^2 du\, h_u \right] = E \exp\left[ -2\int_0^1 du\, h_u \right] \exp\left[ -\int_1^2 du\, h_u \right].$  (44)

The same reasoning yields

$E \exp\left[ -\int_0^1 du\, h_u \right] \exp\left[ -\int_0^2 du\, h_u \right] = \exp\left[ \alpha_1(0)\bar{h}_2 + 2[\hat{\alpha}_1(\hat{\beta}_1(0)/2) + \hat{\beta}_1(\hat{\beta}_1(0)/2)]\bar{h}_1 \right].$  (45)

The joint default probabilities $p_{2,2}$ and $p_{1,2}$ then follow from (43) and (45).


7 Model comparisons – closed form results

Our first set of model comparisons will utilize the closed form results described in the previous

sections. We will restrict the comparisons here to the two period setting, and to second order results

(that is, default volatilities and joint probabilities for two assets); results for multiple periods and

actual distributions of default rates will be analyzed through Monte Carlo in the next section.

For our two period comparisons, we will analyze four sets of parameters: investment and speculative

grade default probabilities,¹⁰ each with two correlation values. The low and high correlation settings will correspond to values of 10% and 40%, respectively, for the asset correlation parameter $\rho$ in the first three models. For the stochastic intensity model, we will investigate two values for the mean reversion speed $\kappa$. The "slow" setting will correspond to $\kappa = 0.29$, such that a random shock

to the intensity process will decay by 25% over the next year; the "fast" setting will correspond

to κ = 1.39, such that a random shock to the intensity process will decay by 75% over one year.

Calibration results are presented in Table 1.

We present the normalized year two default volatilities for each model in Figure 1. As defined in (5)

and (6), the marginal and cumulative default volatilities are the standard deviation of the marginal

and cumulative two year default rates of a large, homogeneous portfolio. As we would expect, the

default volatilities are greater in the high correlation cases than in the low correlation cases. Of the

five models tested, the stochastic intensity model with slow mean reversion seems to produce the

highest levels of default volatility, indicating that correlations in the second period tend to be higher

for this model than for the others.

It is interesting to note that of the first three models, all of which are based on the normal distribution

and default thresholds, the copula approach in all four cases has a relatively low marginal default

volatility but a relatively high cumulative default volatility. (The slow stochastic intensity model is

in fact the only other model to show a marginal volatility less than the cumulative volatility.) Note

¹⁰Taken from Exhibit 30 of Keenan et al (2000).


that the cumulative two year default rate is the sum of the first and second year marginal default

rates, and thus that the two year cumulative default volatility is composed of three terms: the first

and second year marginal default volatilities and the covariance between the first and second years.

Our calibration guarantees that the first year default volatilities are identical across the models.

Thus, the behavior of the copula model suggests a stronger covariance term (that is, a stronger link

between year one and year two defaults) than for either of the two CreditMetrics extensions.

To further investigate the links between default events, we examine the conditional probability of a default in the second year, given the default of another asset. To be precise, for two distinct assets $i$ and $j$, we will calculate the conditional probability that asset $i$ defaults in year two, given that asset $j$ defaults in year one, normalized by the unconditional probability that asset $i$ defaults in year two. In terms of quantities we have already defined, this normalized conditional probability is equal to $p_{1,2}/(p_1 p_2)$. We will also calculate the normalized conditional probability that asset $i$ defaults in year two, given that asset $j$ defaults in year two, given by $p_{2,2}/p_2^2$. For both of these quantities, a

value of one indicates that the first asset defaulting does not affect the chance that the second asset

defaults; a value of four indicates that the second asset is four times more likely to default if the

first asset defaults than it is if we have no information about the first asset. Thus, the probability

conditional on a year two default can be interpreted as an indicator of contemporaneous correlation

of defaults, and the probability conditional on a year one default as an indicator of lagged default

correlation.

The normalized conditional probabilities under the five models are presented in Figure 2. As we

expect, there is no lagged correlation for the discrete CreditMetrics extension. Interestingly, the

copula and both stochastic intensity models often show a higher lagged than contemporaneous

correlation. While it is difficult to establish much intuition for the copula model, this phenomenon

can be rationalized in the stochastic intensity setting. For this model, any shock to the default

intensity will tend to persist longer than one year. If one asset defaults in the first year, it is most

likely due to a positive shock to the intensity process; this shock then persists into the second year,

where the other asset is more likely to default than normal. Further, shocks are more persistent for the


slower mean reversion, explaining why the difference in lagged and contemporaneous correlation

is more pronounced in this case. By contrast, the two CreditMetrics extensions show much higher

contemporaneous than lagged correlation; this lack of persistence in the correlation structure will

manifest itself more strongly over longer horizons.

To this point, we have calibrated the collection of models to have the same means over two periods,

and the same volatilities over one period. We have then investigated the remaining second order

statistics – the second period volatility and the correlation between the first and second periods – that

depend on the particular models. In the next section, we will extend the analysis on two fronts: first,

we will investigate more horizons in order to examine the effects of lagged and contemporaneous

correlations over longer times; second, we will investigate the entire distribution of portfolio defaults

rather than just the second order moments.

8 Model comparisons – simulation results

In this section, we perform Monte Carlo simulations for the five models investigated previously.

In each case, we begin with a homogeneous portfolio of one hundred speculative grade bonds. We

calibrate the model to the cumulative default probabilities in Table 2 and to the two correlation

settings from the previous section. Over 1,000 trials, we simulate the number of bonds that default

within each year, up to a final horizon of six years.¹¹

The simulation procedures are straightforward for the two CreditMetrics extensions and the copula

approach. For the stochastic intensity framework, we simulate the evolution of the intensity process

according to (24). This requires a discretization of (24):

$h_{t+\Delta t} - h_t \approx -\kappa(h_t - \bar{h}_k)\,\Delta t + \sigma\sqrt{h_t}\,\sqrt{\Delta t}\,\varepsilon,$  (46)

¹¹As we have pointed out before, it is possible to simulate continuous default times under the copula and stochastic intensity frameworks. In order to compare with the two CreditMetrics extensions, we restrict the analysis to annual buckets.


where $\varepsilon$ is a standard normal random variable.¹² Given the intensity process path for a particular scenario, we then compute the conditional survival probability for each annual period as in (21). Finally, we generate defaults by drawing independent binomial random variables with the appropriate probability.
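A minimal sketch of one simulation scenario (the fixed step count and the floor at zero are simplifications; as footnote 12 notes, the paper instead shrinks the timestep until negative intensities are sufficiently unlikely):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_intensity_scenario(h_bars, kappa, sigma, n_assets=100,
                                steps_per_year=250):
    """Euler scheme for (46), then conditionally independent defaults."""
    dt = 1.0 / steps_per_year
    h = h_bars[0]                           # h_0 = h1_bar
    alive, defaults = n_assets, []
    for h_bar in h_bars:                    # one mean reversion level/year
        integral = 0.0
        for _ in range(steps_per_year):
            integral += h * dt
            h += (-kappa * (h - h_bar) * dt
                  + sigma * np.sqrt(h * dt) * rng.standard_normal())
            h = max(h, 0.0)                 # crude guard, see footnote 12
        p_def = 1.0 - np.exp(-integral)     # conditional on the path, (21)
        n_def = rng.binomial(alive, p_def)
        defaults.append(n_def)
        alive -= n_def
    return defaults
```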

The simulation time for the five models is a direct result of the number of timesteps needed. The

copula model simulates the default times directly, and is therefore the fastest. The two CreditMetrics

models require only annual timesteps, and require roughly 50% more runtime than the copula model.

For the stochastic intensity model, the need to simulate over many timesteps produces a runtime

over one hundred times greater than the simpler models.

We first examine default rate volatilities over the six horizons. As in the previous section, we

consider the normalized cumulative default rate volatility. For yeark, this is the standard deviation

of the number of defaults that occur in years one throughk, divided by the expected number of

defaults in that period. This is essentially the quantity defined in (6), with the exception that

here we consider a finite portfolio. The default volatilities from our simulations are presented

in Figure 3. Our calibration guarantees that the first year default volatilities are essentially the

same. The second year results are similar to those in Figure 1, with slightly higher volatility for

the slow stochastic intensity model, and slightly lower volatility for the discrete CreditMetrics

extension. At longer horizons, these differences are amplified: the slow stochastic intensity and

discrete CreditMetrics models show high and low volatilities, respectively, while the remaining

three models are indistinguishable.

Though default rate volatilities are illustrative, they do not provide us information about the full distribution of defaults through time. At the one year horizon, our calibration guarantees that volatility will be consistent across the five models; the distribution assumptions, however, influence the precise

¹²Note that while (24) guarantees a non-negative solution for $h$, the discretized version admits a small probability that $h_{t+\Delta t}$ will be negative. To reduce this possibility, we choose $\Delta t$ for each timestep such that the probability that $h_{t+\Delta t} < 0$ is sufficiently small. The result is that while we only need 50 timesteps per year in some cases, we require as many as one thousand when the value of $\sigma$ is large, as in the high correlation, fast mean reversion case.


shape of the portfolio distribution. We see in Table 3 that there is actually very little difference

between even the 1st percentiles of the distributions, particularly in the low correlation case. For

the full six year horizon, Table 4 shows more differences between the percentiles. Consistent with

the default volatility results, the tail percentiles are most extreme for the slow stochastic intensity

model, and least extreme for discrete CreditMetrics. Interestingly, though the CreditMetrics diffusion model shows similar volatility to the copula and fast stochastic intensity models, it produces

less extreme percentiles than these other models. Note also that among distributions with similar

means, the median serves well as an indicator of skewness. The high correlation setting generally,

and the slow stochastic intensity model in particular, show lower medians. For these cases, the

distribution places higher probability on the worst default scenarios as well as the scenarios with

few or no defaults.

The cumulative probability distributions for the six year horizons are presented in Figures 4 through

7. As in the other comparisons, the slow stochastic intensity model is notable for placing large probability on the very low and high default rate scenarios, while the discrete CreditMetrics extension

stands out as the most benign of the distributions. Most striking, however, is the similarity between

the fast stochastic intensity and copula models, which are difficult to differentiate even at the most

extreme percentile levels.

As a final comparison of the default distributions, we consider the pricing of a simple structure

written on our portfolio. Suppose each of the one hundred bonds in the portfolio has a notional

value of $1 million, and that in the event of a default the recovery rate on each bond is forty percent.

The structure is composed of three elements:

(i) First loss protection. As defaults occur, the protection seller reimburses the structure up to a

total payment of $10 million. Thus, the seller pays $600,000 at the time of the first default,

$600,000 at the time of each of the subsequent fifteen defaults, and $400,000 at the time of

the seventeenth default.

(ii) Second loss protection. The protection seller reimburses the structure for losses in excess of


$10 million, up to a total payment of $20 million. This amounts to reimbursing the losses on

the seventeenth through the fiftieth defaults.

(iii) Senior notes. Notes with a notional value of $100 million maturing after six years. The notes

suffer a principal loss if the first and second loss protection are fully utilized – that is, if more

than fifty defaults occur.

For the first and second loss protection, we will estimate the cost of the protection based on a

constant discount rate of 7%. In each scenario, we produce the timing and amounts of the protection

payments, and discount these back to the present time. The price of the protection is then the average

discounted value across the 1,000 scenarios. For the senior notes, we compute the expected principal

loss at maturity, which is used by Moody’s along with Table 5 to determine the notes’ rating.

Additionally, we compute the total amount of protection (capital) required to achieve a rating of A3

(an expected loss of 0.5%) and Aa3 (an expected loss of 0.101%).
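A sketch of the pricing of the loss protection legs under stated assumptions (payments at the end of the default year, losses of $0.6M per default on $1M notional with 40% recovery; `default_years` is a hypothetical per-scenario array of default times in years):

```python
import numpy as np

def protection_price(default_years, attach, detach, rate=0.07,
                     loss_per_default=0.6, horizon=6):
    """Average discounted payments for a tranche of loss protection.

    attach/detach in $M: (0, 10) for first loss, (10, 30) for second
    loss (a total payment of $20M).
    """
    prices = []
    for taus in default_years:
        taus = np.sort(taus[taus <= horizon])
        cum = np.arange(1, len(taus) + 1) * loss_per_default
        # payment at each default = increment of the clipped cumulative loss
        pay = (np.clip(cum, attach, detach)
               - np.clip(cum - loss_per_default, attach, detach))
        prices.append(np.sum(pay / (1 + rate) ** taus))
    return np.mean(prices)
```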

We present the first and second loss prices in Table 6, along with the expected loss, current rating,

and required capital for the senior notes. The slow stochastic intensity model yields the lowest

pricing for the first loss protection, the worst rating for the senior notes, and the highest required

capital. The results for the other models are as expected, with the copula and fast mean reversion

models yielding the most similar results.

9 Conclusion

The analysis of Collateralized Debt Obligations, and other structured products written on credit

portfolios, requires a model of correlated defaults over multiple horizons. For single horizon

models, the effect of model and distribution choice on the model results is well understood. For

the multiple horizon models, however, there has been little research.

We have outlined four approaches to multiple horizon modeling of defaults in a portfolio. We

have calibrated the four models to the same set of input data (average defaults and a single period


correlation parameter), and have investigated the resulting default distributions. The differences we

observe can be attributed to the model structures, and to some extent, to the choice of distributions

that drive the models. Our results show a significant disparity. The rating on a class of senior

notes under our low correlation assumption varied from Aaa to A3, and under our high correlation

assumption from A1 to Baa3. Additionally, the capital required to achieve a target investment grade

rating varied by as much as a factor of two.

In the single period case, a number of studies have concluded that when calibrated to the same

first and second order information, the various models do not produce vastly different conclusions.

Here, the issue of model choice is much more important, and any analysis of structures over multiple

horizons should heed this potential model error.

References

Cifuentes, A., Choi, E., and Waite, J. (1998). Stability of ratings of CBO/CLO tranches. Moody's Investors Service.

Credit Suisse Financial Products. (1997). CreditRisk+: A credit risk management framework.

Duffie, D. and Garleanu, N. (1998). Risk and valuation of Collateralized Debt Obligations. Working paper. Graduate School of Business, Stanford University. http://www.stanford.edu/~duffie/working.htm

Duffie, D. and Singleton, K. (1998). Simulating correlated defaults. Working paper. Graduate School of Business, Stanford University. http://www.stanford.edu/~duffie/working.htm

Duffie, D. and Singleton, K. (1999). Modeling term structures of defaultable bonds. Review of Financial Studies, 12, 687-720.

Finger, C. (1998). Sticks and stones. Working paper. RiskMetrics Group. http://www.riskmetrics.com/research/working

Gordy, M. (2000). A comparative anatomy of credit risk models. Journal of Banking & Finance, 24 (January), 119-149.

Gupton, G., Finger, C., and Bhatia, M. (1997). CreditMetrics – Technical Document. Morgan Guaranty Trust Co. http://www.riskmetrics.com/research/techdoc

Keenan, S., Hamilton, D. and Berthault, A. (2000). Historical default rates of corporate bond issuers, 1920-1999. Moody's Investors Service.

Kolyoglu, U. and Hickman, A. (1998). Reconcilable differences. Risk, October.

Li, D. (1999). The valuation of basket credit derivatives. CreditMetrics Monitor, April, 34-50. http://www.riskmetrics.com/research/journals

Li, D. (2000). On default correlation: a copula approach. The Journal of Fixed Income, 9 (March), 43-54.

Nagpal, K. and Bahar, R. (1999). An analytical approach for credit risk analysis under correlated defaults. CreditMetrics Monitor, April, 51-74. http://www.riskmetrics.com/research/journals

Standard & Poor's. (2000). Ratings performance 1999: Stability & Transition.

Wilson, T. (1997). Portfolio Credit Risk I. Risk, September.

Wilson, T. (1997). Portfolio Credit Risk II. Risk, October.


Table 1: Calibration results.

                                   Investment grade          Speculative grade
Parameter                          Low corr    High corr     Low corr    High corr
Inputs
  p1                               0.16%       0.16%         3.35%       3.35%
  p2                               0.33%       0.33%         3.41%       3.41%
  p1,1                             0.0007%     0.0059%       0.1776%     0.5190%
Discrete CreditMetrics extension
  α1                               -2.95       -2.95         -1.83       -1.83
  α2                               -2.72       -2.72         -1.81       -1.81
  ρ                                10%         40%           10%         40%
Diffusion CreditMetrics extension
  α1                               -2.95       -2.95         -1.83       -1.83
  α2                               -3.78       -3.78         -2.34       -2.34
  ρ                                10%         40%           10%         40%
Copula functions
  α1                               -2.95       -2.95         -1.83       -1.83
  α2                               -2.58       -2.58         -1.49       -1.49
  ρ                                10%         40%           10%         40%
Stochastic intensity – slow mean reversion
  κ                                0.29        0.29          0.29        0.29
  σ                                0.10        0.37          0.28        0.76
  h̄1                               0.16%       0.16%         3.44%       3.67%
  h̄2                               1.47%       1.58%         6.06%       12.10%
Stochastic intensity – fast mean reversion
  κ                                1.39        1.39          1.39        1.39
  σ                                0.14        0.53          0.40        1.12
  h̄1                               0.16%       0.16%         3.44%       3.68%
  h̄2                               0.53%       0.55%         4.00%       5.02%

Table 2: Moody’s speculative grade cumulative default probabilities. From Exhibit 30, Keenan et al (2000).

Year          1       2       3       4        5        6
Probability   3.35%   6.76%   9.98%   12.89%   15.57%   17.91%


Table 3: One year default statistics. Speculative grade.

Statistic          CM Discrete   CM Diffusion   Copula   Stoch. Int. Slow   Stoch. Int. Fast
Low correlation
  Mean             3.37          3.36           3.51     3.20               3.20
  St. Dev.         3.15          3.27           3.40     3.03               3.05
  Median           3             2              3        3                  2
  5th percentile   10            9              10       9                  10
  1st percentile   14            15             15       13                 14
High correlation
  Mean             3.62          3.24           3.72     3.69               3.56
  St. Dev.         7.08          6.32           7.52     6.84               6.73
  Median           1             1              1        1                  1
  5th percentile   19            15             19       19                 16
  1st percentile   37            32             34       30                 35

Table 4: Six year cumulative default statistics. Speculative grade.

Statistic          CM Discrete   CM Diffusion   Copula   Stoch. Int. Slow   Stoch. Int. Fast
Low correlation
  Mean             17.72         16.93          18.04    17.34              18.10
  St. Dev.         6.40          8.68           9.66     16.15              9.73
  Median           17            16             17       12                 16
  5th percentile   29            33             37       52                 37
  1st percentile   34            42             47       73                 49
High correlation
  Mean             18.41         17.28          18.61    19.81              20.41
  St. Dev.         13.49         17.41          19.27    24.37              19.36
  Median           15            12             12       9                  13
  5th percentile   45            54             63       82                 62
  1st percentile   59            73             78       98                 86


Table 5: Target expected losses for six year maturity. From Chart 3, Cifuentes et al (2000).

Rating   Expected loss
Aaa      0.002%
Aa1      0.023%
Aa2      0.048%
Aa3      0.101%
A1       0.181%
A2       0.320%
A3       0.500%
Baa1     0.753%
Baa2     1.083%
Baa3     2.035%

Table 6: Prices (in $M) for first and second loss protection. Expected loss, rating, and required capital ($M) for senior notes. Speculative grade collateral.

                                                    Senior notes
                     First loss   Second loss   Exp. loss   Rating   Capital (Aa3)   Capital (A3)
Low correlation
  CM Discrete        7.227        1.350         0.000%      Aaa      17.3            13.8
  CM Diffusion       6.676        1.533         0.017%      Aa1      21.6            15.9
  Copula             6.788        1.936         0.022%      Aa1      24.5            18.0
  Stoch. int. slow   5.533        2.501         0.466%      A3       39.8            29.4
  Stoch. int. fast   6.763        1.911         0.038%      Aa2      25.7            18.3
High correlation
  CM Discrete        6.117        2.698         0.159%      A1       32.3            23.6
  CM Diffusion       5.144        2.832         0.514%      Baa1     41.1            30.2
  Copula             5.210        3.200         0.821%      Baa2     43.7            34.4
  Stoch. int. slow   4.856        3.307         1.903%      Baa3     54.5            46.1
  Stoch. int. fast   5.685        3.500         0.918%      Baa2     45.9            35.2


Figure 1: Marginal and cumulative year two default volatility.

[Four panels – investment/speculative grade × low/high correlation – showing marginal and cumulative year two default volatility for CM Discrete, CM Diffusion, Copula, Stoch. Int. Slow, and Stoch. Int. Fast.]

Figure 2: Year two conditional default probability given default of a second asset.

[Four panels – investment/speculative grade × low/high correlation – showing, for each of the five models, the normalized year two default probability conditional on a first-year default and conditional on a second-year default.]

Figure 3: Normalized cumulative default rate volatilities. Speculative grade.

[Two panels – high and low correlation – plotting normalized cumulative default rate volatility against horizon (years 1–6) for CM Discrete, CM Diffusion, Copula, St.Int. Slow, and St.Int. Fast.]

Figure 4: Distribution of cumulative six year defaults. Speculative grade, low correlation.

[Cumulative probability (0–100%) versus number of defaults (0–100) for CM Discrete, CM Diffusion, Copula, St.Int. Slow, and St.Int. Fast.]

Figure 5: Distribution of cumulative six year defaults, extreme cases. Speculative grade, low correlation.

[Cumulative probability (80–100%) versus number of defaults (20–100) for the five models.]

Figure 6: Distribution of cumulative six year defaults. Speculative grade, high correlation.

[Figure: cumulative probability (0 to 100%) against number of defaults (0 to 100) for the same five models.]


Figure 7: Distribution of cumulative six year defaults, extreme cases. Speculative grade, high correlation.

[Figure: upper tail of the distribution (cumulative probability 80% to 100%, 20 to 100 defaults) for the same five models.]


The RiskMetrics Group Working Paper Number 99-07

On Default Correlation: A Copula Function Approach

David X. Li

This draft: February 2000
First draft: September 1999

44 Wall St., New York, NY 10005

[email protected]


On Default Correlation: A Copula Function Approach

David X. Li

February 2000

Abstract

This paper studies the problem of default correlation. We first introduce a random variable called “time-until-default” to denote the survival time of each defaultable entity or financial instrument, and define the default correlation between two credit risks as the correlation coefficient between their survival times. Then we argue why a copula function approach should be used to specify the joint distribution of survival times after marginal distributions of survival times are derived from market information, such as risky bond prices or asset swap spreads. The definition and some basic properties of copula functions are given. We show that the current CreditMetrics approach to default correlation through asset correlation is equivalent to using a normal copula function. Finally, we give some numerical examples to illustrate the use of copula functions in the valuation of some credit derivatives, such as credit default swaps and first-to-default contracts.


1 Introduction

The rapidly growing credit derivative market has created a new set of financial instruments which can be

used to manage the most important dimension of financial risk - credit risk. In addition to the standard

credit derivative products, such as credit default swaps and total return swaps based upon a single underlying

credit risk, many new products are now associated with a portfolio of credit risks. A typical example is the

product with payment contingent upon the time and identity of the first or second-to-default in a given credit

risk portfolio. Variations include instruments with payment contingent upon the cumulative loss before a

given time in the future. The equity tranche of a collateralized bond obligation (CBO) or a collateralized

loan obligation (CLO) is yet another variation, where the holder of the equity tranche incurs the first loss.

Deductible and stop-loss in insurance products could also be incorporated into the basket credit derivatives

structure. As more financial firms try to manage their credit risk at the portfolio level and the CBO/CLO

market continues to expand, the demand for basket credit derivative products will most likely continue to

grow.

Central to the valuation of the credit derivatives written on a credit portfolio is the problem of default

correlation. The problem of default correlation even arises in the valuation of a simple credit default swap

with one underlying reference asset if we do not assume the independence of default between the reference

asset and the default swap seller. Surprising though it may seem, the default correlation has not been well

defined and understood in finance. Existing literature tends to define default correlation based on discrete

events which dichotomize according to survival or nonsurvival at a critical period such as one year. For

example, if we denote

$q_A = \Pr[E_A], \quad q_B = \Pr[E_B], \quad q_{AB} = \Pr[E_A E_B]$

where $E_A$, $E_B$ are defined as the default events of two securities $A$ and $B$ over one year. Then the default

correlation $\rho$ between the two default events $E_A$ and $E_B$, based on the standard definition of correlation of two

random variables, is defined as follows

1

Page 43: Selection of Research Material relating to RiskMetrics Group CDO Manager

$\rho = \frac{q_{AB} - q_A \cdot q_B}{\sqrt{q_A(1 - q_A)\,q_B(1 - q_B)}}. \qquad (1)$

This discrete event approach has been taken by Lucas [1995]. Hereafter we simply call this definition of

default correlation the discrete default correlation.

However, the choice of a specific period like one year is more or less arbitrary. It may correspond with many

empirical studies of default rates over a one-year period. But the dependence of default correlation on a specific

time interval has its disadvantages. First, default is a time dependent event, and so is default correlation. Let

us take the survival time of a human being as an example. The probability of dying within one year for a

person aged 50 years today is about 0.6%, but the probability of dying for the same person within 50 years is

almost a sure event. Similarly default correlation is a time dependent quantity. Let us now take the survival

times of a couple, both aged 50 years today. The correlation between the two discrete events that each dies

within one year is very small. But the correlation between the two discrete events that each dies within 100

years is 1. Second, concentration on a single period of one year wastes important information. There are

empirical studies which show that the default tendency of corporate bonds is linked to their age since issue.

Also there are strong links between the economic cycle and defaults. Arbitrarily focusing on a one year period

neglects this important information. Third, in the majority of credit derivative valuations, what we need is

not the default correlation of two entities over the next year. We may need to have a joint distribution of

survival times for the next 10 years. Fourth, the calculation of default rates as simple proportions is possible

only when no samples are censored during the one-year period.¹

This paper introduces a few techniques used in survival analysis. These techniques have been widely applied

to other areas, such as life contingencies in actuarial science and industry life testing in reliability studies,

which are similar to the credit problems we encounter here. We first introduce a random variable called

¹ A company that is observed, default free, by Moody’s for 5 years and then withdrawn from the Moody’s study must have a survival time exceeding 5 years. Another company may enter into Moody’s study in the middle of a year, which implies that Moody’s observes the company for only half of the one-year observation period. In the survival analysis of statistics, such incomplete observation of default time is called censoring. According to Moody’s studies, such incomplete observation does occur in Moody’s credit default samples.


“time-until-default” to denote the survival time of each defaultable entity or financial instrument. Then,

we define the default correlation of two entities as the correlation between their survival times. In credit

derivative valuation we need first to construct a credit curve for each credit risk. A credit curve gives all

marginal conditional default probabilities over a number of years. This curve is usually derived from the

risky bond spread curve or asset swap spreads observed currently from the market. Spread curves and asset

swap spreads contain information on default probabilities, recovery rate and liquidity factors etc. Assuming

an exogenous recovery rate and a default treatment, we can extract a credit curve from the spread curve or

asset swap spread curve. For two credit risks, we would obtain two credit curves from market observable

information. Then, we need to specify a joint distribution for the survival times such that the marginal

distributions are the credit curves. Obviously, this problem has no unique solution. Copula functions used in

multivariate statistics provide a convenient way to specify the joint distribution of survival times with given

marginal distributions. The concept of copula functions, their basic properties, and some commonly used

copula functions are introduced. Finally, we give a few numerical examples of credit derivative valuation to

demonstrate the use of copula functions and the impact of default correlation.

2 Characterization of Default by Time-Until-Default

In the study of default, interest centers on a group of individual companies for each of which there is defined

a point event, often called default (or survival), occurring after a length of time. We introduce a random

variable called the time-until-default, or simply survival time, for a security, to denote this length of time.

This random variable is the basic building block for the valuation of cash flows subject to default.

To precisely determine time-until-default, we need: an unambiguously defined time origin, a time scale for

measuring the passage of time, and a clear definition of default.

We choose the current time as the time origin to allow use of current market information to build credit

curves. The time scale is defined in terms of years for continuous models, or number of periods for discrete

models. The meaning of default is defined by some rating agencies, such as Moody’s.


2.1 Survival Function

Let us consider an existing security $A$. This security’s time-until-default, $T_A$, is a continuous random variable

which measures the length of time from today to the time when default occurs. For simplicity we just use $T$,

which should be understood as the time-until-default for a specific security $A$. Let $F(t)$ denote the distribution

function of $T$,

$F(t) = \Pr(T \le t), \quad t \ge 0 \qquad (2)$

and set

$S(t) = 1 - F(t) = \Pr(T > t), \quad t \ge 0. \qquad (3)$

We also assume that F(0) = 0, which implies S(0) = 1. The function S(t) is called the survival function.

It gives the probability that a security will attain age $t$. The distribution of $T_A$ can be defined by specifying

either the distribution function F(t) or the survival function S(t). We can also define a probability density

function as follows

$f(t) = F'(t) = -S'(t) = \lim_{\Delta \to 0^+} \frac{\Pr[t \le T < t + \Delta]}{\Delta}.$

To make probability statements about a security which has survived $x$ years, the future lifetime for this

security is $T - x \mid T > x$. We introduce two more notations

${}_t q_x = \Pr[T - x \le t \mid T > x], \quad t \ge 0$

${}_t p_x = 1 - {}_t q_x = \Pr[T - x > t \mid T > x], \quad t \ge 0. \qquad (4)$

The symbol ${}_t q_x$ can be interpreted as the conditional probability that the security $A$ will default within the

next $t$ years conditional on its survival for $x$ years. In the special case of $x = 0$, we have

${}_t p_0 = S(t), \quad t \ge 0.$


If $t = 1$, we use the actuarial convention to omit the prefix 1 in the symbols ${}_t q_x$ and ${}_t p_x$, and we have

$p_x = \Pr[T - x > 1 \mid T > x]$

$q_x = \Pr[T - x \le 1 \mid T > x].$

The symbol $q_x$ is usually called the marginal default probability, which represents the probability of default

in the next year conditional on the survival until the beginning of the year. A credit curve is then simply

defined as the sequence of $q_0, q_1, \ldots, q_n$ in discrete models.

2.2 Hazard Rate Function

The distribution function F(t) and the survival function S(t) provide two mathematically equivalent ways

of specifying the distribution of the random variable time-until-default, and there are many other equiva-

lent functions. The one used most frequently by statisticians is the hazard rate function which gives the

instantaneous default probability for a security that has attained age x.

$\Pr[x < T \le x + \Delta x \mid T > x] = \frac{F(x + \Delta x) - F(x)}{1 - F(x)} \approx \frac{f(x)\,\Delta x}{1 - F(x)}.$

The function $\frac{f(x)}{1 - F(x)}$ has a conditional probability density interpretation: it gives the value of the conditional probability density

function of $T$ at exact age $x$, given survival to that time. Let’s denote it as $h(x)$, which is usually called

the hazard rate function. The relationship of the hazard rate function with the distribution function and

survival function is as follows


$h(x) = \frac{f(x)}{1 - F(x)} = -\frac{S'(x)}{S(x)}. \qquad (5)$

Then, the survival function can be expressed in terms of the hazard rate function,

$S(t) = e^{-\int_0^t h(s)\,ds}.$

Now, we can express ${}_t q_x$ and ${}_t p_x$ in terms of the hazard rate function as follows

${}_t p_x = e^{-\int_0^t h(s+x)\,ds}, \qquad (6)$

${}_t q_x = 1 - e^{-\int_0^t h(s+x)\,ds}.$

In addition,

$F(t) = 1 - S(t) = 1 - e^{-\int_0^t h(s)\,ds},$

and

$f(t) = S(t) \cdot h(t), \qquad (7)$

which is the density function for $T$.

A typical assumption is that the hazard rate is a constant, $h$, over a certain period, such as $[x, x+1]$. In this

case, the density function is

$f(t) = h e^{-ht}$


which shows that the survival time follows an exponential distribution with parameter $h$. Under this assumption, the survival probability over the time interval $[x, x+t]$ for $0 < t \le 1$ is

${}_t p_x = 1 - {}_t q_x = e^{-\int_0^t h(s)\,ds} = e^{-ht} = (p_x)^t$

where $p_x$ is the probability of survival over a one-year period. This assumption can be used to scale down the

default probability over one year to a default probability over a time interval of less than one year.
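As a small illustrative sketch (ours, not the paper’s), this scaling is a one-liner in code; the 7.27% input is the Moody’s one-year B rating default rate quoted in Section 4.

```python
def sub_year_default_prob(annual_q: float, t: float) -> float:
    """Default probability over a horizon t <= 1 year, assuming a constant
    hazard rate over the year, so that tpx = (px)**t."""
    return 1.0 - (1.0 - annual_q) ** t

# Example: scaling a 7.27% one-year default probability to six months
print(sub_year_default_prob(0.0727, 0.5))  # ~0.0370
```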

Modelling a default process is equivalent to modelling a hazard function. There are a number of reasons why

modelling the hazard rate function may be a good idea. First, it provides us information on the immediate

default risk of each entity known to be alive at exact age t . Second, the comparisons of groups of individuals

are most incisively made via the hazard rate function. Third, the hazard rate function based model can be

easily adapted to more complicated situations, such as where there is censoring or there are several types

of default or where we would like to consider stochastic default fluctuations. Fourth, there are a lot of

similarities between the hazard rate function and the short rate. Many modeling techniques for the short rate

processes can be readily borrowed to model the hazard rate.

Finally, we can define the joint survival function for two entities $A$ and $B$ based on their survival times $T_A$ and $T_B$,

$S_{T_A T_B}(s, t) = \Pr[T_A > s, T_B > t].$

The joint distribution function is

$F(s, t) = \Pr[T_A \le s, T_B \le t] = 1 - S_{T_A}(s) - S_{T_B}(t) + S_{T_A T_B}(s, t).$

The aforementioned concepts and results can be found in survival analysis books, such as Bowers et al.

[1997], Cox and Oakes [1984].


3 Definition of Default Correlations

The default correlation of two entities $A$ and $B$ can then be defined with respect to their survival times $T_A$ and $T_B$ as follows

$\rho_{AB} = \frac{\mathrm{Cov}(T_A, T_B)}{\sqrt{\mathrm{Var}(T_A)\,\mathrm{Var}(T_B)}} = \frac{E(T_A T_B) - E(T_A)E(T_B)}{\sqrt{\mathrm{Var}(T_A)\,\mathrm{Var}(T_B)}}. \qquad (8)$

Hereafter we simply call this definition of default correlation the survival time correlation. The survival

time correlation is a much more general concept than that of the discrete default correlation based on a single

period. If we have the joint distribution $f(s, t)$ of two survival times $T_A$, $T_B$, we can calculate the discrete

default correlation. For example, if we define

$E_1 = [T_A < 1], \quad E_2 = [T_B < 1],$

then the discrete default correlation can be calculated using equation (1) with the following calculation

$q_{12} = \Pr[E_1 E_2] = \int_0^1 \int_0^1 f(s, t)\,ds\,dt$

$q_1 = \int_0^1 f_A(s)\,ds$

$q_2 = \int_0^1 f_B(t)\,dt.$

However, knowing the discrete default correlation over one year period does not allow us to specify the

survival time correlation.

4 The Construction of the Credit Curve

The distribution of survival time or time-until-default can be characterized by the distribution function,

survival function or hazard rate function. It is shown in Section 2 that all default probabilities can be


calculated once a characterization is given. The hazard rate function used to characterize the distribution of

survival time can also be called a credit curve due to its similarity to a yield curve. But the basic question is:

how do we obtain the credit curve or the distribution of survival time for a given credit?

There exist three methods to obtain the term structure of default rates:

(i) Obtaining historical default information from rating agencies;

(ii) Taking the Merton option theoretical approach;

(iii) Taking the implied approach using market prices of defaultable bonds or asset swap spreads.

Rating agencies like Moody’s publish historical default rate studies regularly. In addition to the commonly

cited one-year default rates, they also present multi-year default rates. From these rates we can obtain the

hazard rate function. For example, Moody’s (see Carty and Lieberman [1997]) publishes weighted average

cumulative default rates from 1 to 20 years. For the B rating, the first 5 years cumulative default rates in

percentage are 7.27, 13.87, 19.94, 25.03 and 29.45. From these rates we can obtain the marginal conditional

default probabilities. The first marginal conditional default probability in year one is simply the one-year

default probability, 7.27%. The other marginal conditional default probabilities can be obtained using the

following formula:

${}_{n+1}q_x = {}_n q_x + {}_n p_x \cdot q_{x+n}, \qquad (9)$

which simply states that the probability of default over the time interval $[0, n+1]$ is the sum of the probability

of default over the time interval $[0, n]$, plus the probability of survival to the end of the $n$th year and default in

the following year. Using equation (9) we have the marginal conditional default probability:

$q_{x+n} = \frac{{}_{n+1}q_x - {}_n q_x}{1 - {}_n q_x},$

which results in the marginal conditional default probabilities in year 2, 3, 4, 5 as 7.12%, 7.05%, 6.36% and

5.90%. If we assume a piecewise constant hazard rate function over each year, then we can obtain the hazard

rate function using equation (6). The hazard rate function obtained is given in Figure (1).
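The computation in this paragraph is easy to reproduce. The following sketch (ours, not part of the paper) bootstraps the marginal conditional default probabilities via equation (9) and the piecewise constant hazard rates via equation (6) from the quoted Moody’s B rating cumulative default rates.

```python
import math

cumulative = [0.0727, 0.1387, 0.1994, 0.2503, 0.2945]  # years 1-5, B rating

marginal, hazard = [], []
prev = 0.0
for nq in cumulative:
    q = (nq - prev) / (1.0 - prev)       # q_{x+n} = (n+1qx - nqx) / (1 - nqx)
    marginal.append(q)
    hazard.append(-math.log(1.0 - q))    # piecewise constant h: q = 1 - e^{-h}
    prev = nq

print([round(q * 100, 2) for q in marginal])  # [7.27, 7.12, 7.05, 6.36, 5.9]
print([round(h, 4) for h in hazard])
```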


Using diffusion processes to describe changes in the value of the firm, Merton [1974] demonstrated that a

firm’s default could be modeled with the Black and Scholes methodology. He showed that stock could be

considered as a call option on the firm with strike price equal to the face value of a single payment debt.

Using this framework we can obtain the default probability for the firm over one period, from which we

can translate this default probability into a hazard rate function. Geske [1977] and Delianedis and Geske

[1998] extended Merton’s analysis to produce a term structure of default probabilities. Using the relationship

between the hazard rate and the default probabilities we can obtain a credit curve.

Alternatively, we can take the implicit approach by using market observable information, such as asset swap

spreads or risky corporate bond prices. This is the approach used by most credit derivative trading desks. The

extracted default probabilities reflect the market-agreed perception today about the future default tendency of

the underlying credit. Li [1998] presents one approach to building the credit curve from market information

based on the Duffie and Singleton [1996] default treatment. In that paper the author assumes that there exists

a series of bonds with maturity 1, 2, .., n years, which are issued by the same company and have the same

seniority. All of those bonds have observable market prices. From the market price of these bonds we can

calculate their yields to maturity. Using the yield to maturity of corresponding treasury bonds we obtain a

yield spread curve over treasury (or asset swap spreads for a yield spread curve over LIBOR). The credit

curve construction is based on this yield spread curve and an exogenous assumption about the recovery rate

based on the seniority and the rating of the bonds, and the industry of the corporation.

The suggested approach is contrary to the use of historical default experience information provided by rating

agencies such as Moody’s. We intend to use market information rather than historical information for the

following reasons:

• The calculation of profit and loss for a trading desk can only be based on current market information.

This current market information reflects the market agreed perception about the evolution of the market

in the future, on which the actual profit and loss depend. The default rate derived from current market

information may be much different than historical default rates.

• Rating agencies use classification variables in the hope that homogeneous risks will be obtained


after classification. This technique has been used elsewhere, such as in pricing automobile insurance.

Unfortunately, classification techniques often omit some firm-specific information. Constructing a

credit curve for each credit allows us to use more firm-specific information.

• Rating agencies react much more slowly than the market in anticipating future credit quality. A typical

example is the rating agencies’ reaction to the recent Asian crisis.

• Ratings are primarily used to calculate default frequency instead of default severity. However, much

of credit derivative value depends on both default frequency and severity.

• The information available from a rating agency is usually the one year default probability for each rating

group and the rating migration matrix. Neither the transition matrices nor the default probabilities

are necessarily stable over long periods of time. In addition, many credit derivative products have

maturities well beyond one year, which requires the use of long-term marginal default probabilities.

It is shown under the Duffie and Singleton approach that a defaultable instrument can be valued as if it were a

default-free instrument by discounting the defaultable cash flow at a credit-risk-adjusted discount factor. The

credit risk adjusted discount factor or the total discount factor is the product of risk-free discount factor and

the pure credit discount factor if the underlying factors affecting default and those affecting the interest rate

are independent. Under this framework and the assumption of a piecewise constant hazard rate function, we

can derive a credit curve or specify the distribution of the survival time.
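As a rough flat-curve sketch of this idea (an approximation, not the paper’s full bootstrap, for which see Li [1998]): if the spread curve is flat at $s$ and a fraction $R$ of value is recovered at default, the credit-risk-adjusted discounting gives a hazard rate of roughly $h \approx s/(1 - R)$.

```python
def flat_hazard(spread_bps: float, recovery: float) -> float:
    """Flat hazard rate implied by a flat spread, using the reduced-form
    approximation spread ~ hazard * (1 - recovery)."""
    return spread_bps / 10_000.0 / (1.0 - recovery)

# The two illustrative credits of Section 6: 300 and 500 bps, 50% recovery
print(flat_hazard(300, 0.5))  # ~0.06
print(flat_hazard(500, 0.5))  # ~0.10
```

These flat levels are consistent with the centers of the bootstrapped credit curves shown in Figures (2) and (3).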

5 Dependent Models - Copula Functions

Let us study some problems of an $n$-credit portfolio. Using either the historical approach or the market

implicit approach, we can construct the marginal distribution of survival time for each of the credit risks in

the portfolio. If we assume mutual independence among the credit risks, we can study any problem associated

with the portfolio. However, the independence assumption of the credit risks is obviously not realistic; in

reality, the default rate for a group of credits tends to be higher in a recession and lower when the economy


is booming. This implies that each credit is subject to the same macroeconomic environment, and that

there exists some form of positive dependence among the credits. To introduce a correlation structure into

the portfolio, we must determine how to specify a joint distribution of survival times, with given marginal

distributions.

Obviously, this problem has no unique solution. Generally speaking, knowing the joint distribution of

random variables allows us to derive the marginal distributions and the correlation structure among the

random variables, but not vice versa. There are many different techniques in statistics which allow us to

specify a joint distribution function with given marginal distributions and a correlation structure. Among

them, copula function is a simple and convenient approach. We give a brief introduction to the concept of

copula function in the next section.

5.1 Definition and Basic Properties of Copula Function

A copula function is a function that links or marries univariate marginals to their full multivariate distribution.

For $m$ uniform random variables, $U_1, U_2, \ldots, U_m$, the joint distribution function $C$, defined as

$C(u_1, u_2, \ldots, u_m, \rho) = \Pr[U_1 \le u_1, U_2 \le u_2, \ldots, U_m \le u_m],$

can also be called a copula function.

Copula functions can be used to link marginal distributions with a joint distribution. For given univariate

marginal distribution functions $F_1(x_1), F_2(x_2), \ldots, F_m(x_m)$, the function

$C(F_1(x_1), F_2(x_2), \ldots, F_m(x_m)) = F(x_1, x_2, \ldots, x_m),$

which is defined using a copula function $C$, results in a multivariate distribution function with univariate

marginal distributions as specified, $F_1(x_1), F_2(x_2), \ldots, F_m(x_m)$.

This property can be easily shown as follows:


$C(F_1(x_1), F_2(x_2), \ldots, F_m(x_m), \rho) = \Pr[U_1 \le F_1(x_1), U_2 \le F_2(x_2), \ldots, U_m \le F_m(x_m)]$
$= \Pr[F_1^{-1}(U_1) \le x_1, F_2^{-1}(U_2) \le x_2, \ldots, F_m^{-1}(U_m) \le x_m]$
$= \Pr[X_1 \le x_1, X_2 \le x_2, \ldots, X_m \le x_m]$
$= F(x_1, x_2, \ldots, x_m).$

The marginal distribution of $X_i$ is

$C(F_1(+\infty), F_2(+\infty), \ldots, F_i(x_i), \ldots, F_m(+\infty), \rho) = \Pr[X_1 \le +\infty, \ldots, X_i \le x_i, \ldots, X_m \le +\infty] = \Pr[X_i \le x_i] = F_i(x_i).$

Sklar [1959] established the converse. He showed that any multivariate distribution function F can be

written in the form of a copula function. He proved the following: If $F(x_1, x_2, \ldots, x_m)$ is a joint multivariate

distribution function with univariate marginal distribution functions $F_1(x_1), F_2(x_2), \ldots, F_m(x_m)$, then there

exists a copula function $C(u_1, u_2, \ldots, u_m)$ such that

$F(x_1, x_2, \ldots, x_m) = C(F_1(x_1), F_2(x_2), \ldots, F_m(x_m)).$

If each $F_i$ is continuous then $C$ is unique. Thus, copula functions provide a unifying and flexible way to

study multivariate distributions.

For simplicity’s sake, we discuss only the properties of bivariate copula functions C(u, v, ρ) for uniform

random variables U and V , defined over the area {(u, v)|0 < u ≤ 1, 0 < v ≤ 1}, where ρ is a correlation

parameter. We call ρ simply a correlation parameter since it does not necessarily equal the usual correlation

coefficient defined by Pearson, nor Spearman’s Rho, nor Kendall’s Tau. The bivariate copula function has

the following properties:

(i) Since U and V are positive random variables, C(0, v, ρ) = C(u, 0, ρ) = 0.


(ii) Since U and V are bounded above by 1, the marginal distributions can be obtained by C(1, v, ρ) = v,

C(u, 1, ρ) = u.

(iii) For independent random variables U and V , C(u, v, ρ) = uv.

Frechet [1951] showed there exist upper and lower bounds for a copula function

$\max(0, u + v - 1) \le C(u, v) \le \min(u, v).$

The multivariate extension of Frechet bounds is given by Dall’Aglio [1972].

5.2 Some Common Copula Functions

We present a few copula functions commonly used in biostatistics and actuarial science.

Frank Copula The Frank copula function is defined as

$C(u, v) = \frac{1}{\alpha} \ln\left[1 + \frac{(e^{\alpha u} - 1)(e^{\alpha v} - 1)}{e^{\alpha} - 1}\right], \quad -\infty < \alpha < \infty.$

Bivariate Normal

$C(u, v) = \Phi_2(\Phi^{-1}(u), \Phi^{-1}(v), \rho), \quad -1 \le \rho \le 1, \qquad (10)$

where $\Phi_2$ is the bivariate normal distribution function with correlation coefficient $\rho$, and $\Phi^{-1}$ is the inverse of a

univariate normal distribution function. As we shall see later, this is the copula function used in CreditMetrics.

Bivariate Mixture Copula Function We can form new copula functions using existing copula functions.

If the two uniform random variables $u$ and $v$ are independent, we have the copula function $C(u, v) = uv$. If

the two random variables are perfectly correlated we have the copula function $C(u, v) = \min(u, v)$. Mixing

the two copula functions by a mixing coefficient ($\rho > 0$) we obtain a new copula function as follows


$C(u, v) = (1 - \rho)uv + \rho\min(u, v), \quad \text{if } \rho > 0.$

If $\rho \le 0$ we have

$C(u, v) = (1 + \rho)uv - \rho(u - 1 + v)\,\Theta(u - 1 + v), \quad \text{if } \rho \le 0,$

where

$\Theta(x) = 1 \text{ if } x \ge 0, \quad \Theta(x) = 0 \text{ if } x < 0.$
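For concreteness, here is a minimal sketch of these three copulas in Python (SciPy for the normal case); it is an illustration, not the paper’s code.

```python
import math
from scipy.stats import norm, multivariate_normal

def frank_copula(u, v, alpha):
    """Frank copula, alpha != 0."""
    num = (math.exp(alpha * u) - 1.0) * (math.exp(alpha * v) - 1.0)
    return math.log(1.0 + num / (math.exp(alpha) - 1.0)) / alpha

def normal_copula(u, v, rho):
    """Bivariate normal copula: Phi2(Phi^{-1}(u), Phi^{-1}(v); rho)."""
    return multivariate_normal.cdf([norm.ppf(u), norm.ppf(v)],
                                   mean=[0.0, 0.0],
                                   cov=[[1.0, rho], [rho, 1.0]])

def mixture_copula(u, v, rho):
    """Bivariate mixture copula as defined above."""
    if rho > 0:
        return (1.0 - rho) * u * v + rho * min(u, v)
    return (1.0 + rho) * u * v - rho * max(u + v - 1.0, 0.0)
```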

5.3 Copula Function and Correlation Measurement

To compare different copula functions, we need to have a correlation measurement independent of marginal

distributions. The usual Pearson’s correlation coefficient, however, depends on the marginal distributions

(See Lehmann [1966]). Both Spearman’s Rho and Kendall’s Tau can be defined using a copula function only

as follows

$\rho_s = 12 \iint [C(u, v) - uv]\,du\,dv,$

$\tau = 4 \iint C(u, v)\,dC(u, v) - 1.$

Comparisons between results using different copula functions should be based on either a common Spear-

man’s Rho or a Kendall’s Tau.
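The double integral for Spearman’s Rho is straightforward to approximate numerically; below is a small sketch (ours, not the paper’s). For the mixture copula above with $\rho > 0$ it returns approximately $\rho$ itself, since $12\iint[\min(u,v) - uv]\,du\,dv = 1$.

```python
import numpy as np

def spearman_rho(copula, n=400):
    """Midpoint-rule approximation of rho_s = 12 * double integral of
    C(u, v) - u*v over the unit square."""
    grid = (np.arange(n) + 0.5) / n
    total = sum(copula(u, v) - u * v for u in grid for v in grid)
    return 12.0 * total / (n * n)

# Check with the mixture copula at rho = 0.5
print(spearman_rho(lambda u, v: 0.5 * u * v + 0.5 * min(u, v)))  # ~0.5
```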

Further examination of copula functions can be found in a survey paper by Frees and Valdez [1998] and a

recent book by Nelsen [1999].


5.4 The Calibration of Default Correlation in Copula Function

Having chosen a copula function, we need to compute the pairwise correlation of survival times. Using the

CreditMetrics (Gupton et al. [1997]) asset correlation approach, we can obtain the default correlation of two

discrete events over a one-year period. As it happens, CreditMetrics uses the normal copula function in its

default correlation formula even though it does not use the concept of copula function explicitly.

First let us summarize how CreditMetrics calculates the joint default probability of two credits $A$ and $B$. Suppose

the one-year default probabilities for $A$ and $B$ are $q_A$ and $q_B$. CreditMetrics would use the following steps

• Obtain $Z_A$ and $Z_B$ such that

$q_A = \Pr[Z < Z_A]$

$q_B = \Pr[Z < Z_B]$

where $Z$ is a standard normal random variable

• If $\rho$ is the asset correlation, the joint default probability for credits $A$ and $B$ is calculated as follows,

$\Pr[Z < Z_A, Z < Z_B] = \int_{-\infty}^{Z_A} \int_{-\infty}^{Z_B} \phi_2(x, y \mid \rho)\,dx\,dy = \Phi_2(Z_A, Z_B, \rho) \qquad (11)$

where $\phi_2(x, y \mid \rho)$ is the standard bivariate normal density function with a correlation coefficient $\rho$, and

$\Phi_2$ is the bivariate cumulative normal distribution function.

If we use a bivariate normal copula function with a correlation parameter $\gamma$, and denote the survival times

for $A$ and $B$ as $T_A$ and $T_B$, the joint default probability can be calculated as follows

$\Pr[T_A < 1, T_B < 1] = \Phi_2(\Phi^{-1}(F_A(1)), \Phi^{-1}(F_B(1)), \gamma) \qquad (12)$

where $F_A$ and $F_B$ are the distribution functions for the survival times $T_A$ and $T_B$. If we notice that


$q_i = \Pr[T_i < 1] = F_i(1)$ and $Z_i = \Phi^{-1}(q_i)$ for $i = A, B$,

then we see that equation (12) and equation (11) give the same joint default probability over a one-year period

if $\rho = \gamma$.
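As a quick numerical sketch of this calculation (with hypothetical inputs, not figures from the paper): take one-year default probabilities and an asset correlation, and compute the joint probability per equations (11)/(12).

```python
from scipy.stats import norm, multivariate_normal

def joint_default_prob(qA, qB, rho):
    """Equations (11)/(12): Phi2(Phi^{-1}(qA), Phi^{-1}(qB); rho)."""
    return multivariate_normal.cdf([norm.ppf(qA), norm.ppf(qB)],
                                   mean=[0.0, 0.0],
                                   cov=[[1.0, rho], [rho, 1.0]])

# Hypothetical one-year default probabilities and asset correlation
qA, qB, rho = 0.05, 0.08, 0.3
qAB = joint_default_prob(qA, qB, rho)
print(qAB, qAB - qA * qB)  # joint probability and its excess over independence
```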

We can conclude that CreditMetrics uses a bivariate normal copula function with the asset correlation as the

correlation parameter in the copula function. Thus, to generate survival times of two credit risks, we use

a bivariate normal copula function with correlation parameter equal to the CreditMetrics asset correlation.

We note that this correlation parameter is not the correlation coefficient between the two survival times. The

correlation coefficient between the survival times is much smaller than the asset correlation. Conveniently,

the marginal distribution of any subset of an n dimensional normal distribution is still a normal distribution.

Using asset correlations, we can construct high dimensional normal copula functions to model the credit

portfolio of any size.

6 Numerical Illustrations

This section gives some numerical examples to illustrate many of the points discussed above. Assume that

we have two credit risks, A and B, which have flat spread curves of 300 bps and 500 bps over LIBOR. These

spreads are usually given in the market as asset swap spreads. Using these spreads and a constant recovery

assumption of 50% we build two credit curves for the two credit risks. For details, see Li [1998]. The two

credit curves are given in Figures (2) and (3). These two curves will be used in the following numerical

illustrations.

6.1 Illustration 1. Default Correlation vs. Length of Time Period

In this example, we study the relationship between the discrete default correlation (1) and the survival time

correlation (8). The survival time correlation is a much more general concept than the discrete default


correlation defined for two discrete default events at an arbitrary period of time, such as one year. Knowing

the former allows us to calculate the latter over any time interval in the future, but not vice versa.

Using two credit curves we can calculate all marginal default probabilities up to any time $t$ in the future, i.e.

${}_t q_0 = \Pr[\tau < t] = 1 - e^{-\int_0^t h(s)\,ds},$

where $h(s)$ is the instantaneous default probability given by a credit curve. If we have the marginal default

probabilities ${}_t q_0^A$ and ${}_t q_0^B$ for both $A$ and $B$, we can also obtain the joint probability of default over the time

interval $[0, t]$ by a copula function $C(u, v)$,

$\Pr[T_A < t, T_B < t] = C({}_t q_0^A, {}_t q_0^B).$

Of course we need to specify a correlation parameter ρ in the copula function. We emphasize that knowing

ρ would allow us to calculate the survival time correlation between TA and TB .

We can now obtain the discrete default correlation coefficient ρt between the two discrete events that A and

B default over the time interval [0, t] based on the formula (1). Intuitively, the discrete default correlation ρt

should be an increasing function of t since the two underlying credits should have a higher tendency of joint

default over longer periods. Using the bivariate normal copula function (10) and ρ = 0.1 as an example we

obtain Figure (4).

From this graph we see explicitly that the discrete default correlation over time interval [0, t] is a function

of t . For example, this default correlation coefficient goes from 0.021 to 0.038 when t goes from six months

to twelve months. The increase slows down as t becomes large.
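The curve in Figure (4) can be reproduced along the following lines. The sketch below assumes flat hazard rates of 6% and 10% (the flat-spread, 50%-recovery approximation of the two credit curves), so its numbers will differ slightly from the figure, which uses the full bootstrapped curves.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

hA, hB, rho = 0.06, 0.10, 0.1      # assumed flat hazards; copula parameter

def discrete_default_corr(t):
    """Discrete default correlation over [0, t]: equation (1), with the joint
    default probability taken from the bivariate normal copula."""
    qA, qB = 1.0 - np.exp(-hA * t), 1.0 - np.exp(-hB * t)
    qAB = multivariate_normal.cdf([norm.ppf(qA), norm.ppf(qB)],
                                  mean=[0.0, 0.0],
                                  cov=[[1.0, rho], [rho, 1.0]])
    return (qAB - qA * qB) / np.sqrt(qA * (1 - qA) * qB * (1 - qB))

for t in (0.5, 1.0, 2.0, 5.0, 10.0):
    print(t, round(float(discrete_default_corr(t)), 4))
```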

6.2 Illustration 2. Default Correlation and Credit Swap Valuation

The second example shows the impact of default correlation on credit swap pricing. Suppose that credit A

is the credit swap seller and credit B is the underlying reference asset. If we buy a default swap of 3 years


with a reference asset of credit B from a risk-free counterparty we should pay 500 bps since holding the

underlying asset and having a long position on the credit swap would create a riskless portfolio. But if we

buy the default swap from a risky counterparty, how much we should pay depends on the credit quality of the

counterparty and the default correlation between the underlying reference asset and the counterparty.

Knowing only the discrete default correlation over one year we cannot value any credit swaps with a maturity

longer than one year. Figure (5) shows the impact of asset correlation (or implicitly default correlation) on the

credit swap premium. From the graph we see that the annualized premium decreases as the asset correlation

between the counterparty and the underlying reference asset increases. Even at zero default correlation the

credit swap has a value less than 500 bps since the counterparty is risky.

6.3 Illustration 3. Default Correlation and First-to-Default Valuation

The third example shows how to value a first-to-default contract. We assume we have a portfolio of n credits.

Let us assume that for each credit i in the portfolio we have constructed a credit curve or a hazard rate function

for its survival time Ti . The distribution function of Ti is Fi(t). Using a copula function C we also obtain

the joint distribution of the survival times as follows

$F(t_1, t_2, \ldots, t_n) = C(F_1(t_1), F_2(t_2), \ldots, F_n(t_n)).$

If we use the normal copula function we have

$F(t_1, t_2, \ldots, t_n) = \Phi_n(\Phi^{-1}(F_1(t_1)), \Phi^{-1}(F_2(t_2)), \ldots, \Phi^{-1}(F_n(t_n)))$

where $\Phi_n$ is the $n$-dimensional normal cumulative distribution function with correlation coefficient matrix $\Sigma$.

To simulate correlated survival times we introduce another series of random variables $Y_1, Y_2, \ldots, Y_n$, such

that

$Y_1 = \Phi^{-1}(F_1(T_1)), \quad Y_2 = \Phi^{-1}(F_2(T_2)), \quad \ldots, \quad Y_n = \Phi^{-1}(F_n(T_n)). \qquad (13)$


Then there is a one-to-one mapping between $Y$ and $T$. Simulating $\{T_i \mid i = 1, 2, \ldots, n\}$ is equivalent to

simulating $\{Y_i \mid i = 1, 2, \ldots, n\}$. As shown in the previous section the correlation between the $Y$’s is the asset

correlation of the underlying credits. Using CreditManager from RiskMetrics Group we can obtain the asset

correlation matrix $\Sigma$. We have the following simulation scheme

• Simulate $Y_1, Y_2, \ldots, Y_n$ from an $n$-dimensional normal distribution with correlation coefficient matrix $\Sigma$.

• Obtain $T_1, T_2, \ldots, T_n$ using $T_i = F_i^{-1}(\Phi(Y_i))$, $i = 1, 2, \ldots, n$ (a code sketch of this scheme follows).
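A minimal sketch of the scheme in code (ours, not the paper’s; it assumes exponential marginals, which matches the constant-hazard example below, and an arbitrary pairwise asset correlation of 0.25):

```python
import numpy as np
from scipy.stats import norm

def simulate_survival_times(hazards, corr, n_sims, seed=0):
    """Normal copula scheme: draw Y ~ N(0, Sigma), then map each component
    to a survival time via T_i = F_i^{-1}(Phi(Y_i)); here F_i is exponential."""
    rng = np.random.default_rng(seed)
    y = rng.multivariate_normal(np.zeros(len(hazards)), corr, size=n_sims)
    u = norm.cdf(y)                                  # correlated uniforms
    return -np.log(1.0 - u) / np.asarray(hazards)    # exponential inverse CDF

# 5 credits, h = 0.1 each, constant pairwise asset correlation 0.25
n, sigma = 5, 0.25
corr = sigma * np.ones((n, n)) + (1.0 - sigma) * np.eye(n)
T = simulate_survival_times([0.1] * n, corr, n_sims=50_000)
first = T.min(axis=1)                                # first-to-default times
```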

With each simulation run we generate the survival times for all the credits in the portfolio. With this

information we can value any credit derivative structure written on the portfolio. We use a simple structure

for illustration. The contract is a two-year transaction which pays one dollar if the first default occurs during

the first two years.

We assume each credit has a constant hazard rate of $h = 0.1$ for $0 < t < +\infty$. From equation (7) we

know the density function for the survival time $T$ is $h e^{-ht}$. This shows that the survival time is exponentially

distributed with mean $1/h$. We also assume that every pair of credits in the portfolio has a constant asset

correlation $\sigma$.²

Suppose we have a constant interest rate r = 0.1. If all the credits in the portfolio are independent, the hazard

rate of the minimum survival time T = min(T1, T2, · · · , Tn) is easily shown to be

$h_T = h_1 + h_2 + \cdots + h_n = nh.$

If $T < 2$, the present value of the contract is $1 \cdot e^{-rT}$. The survival time for the first-to-default has a density

function $f(t) = h_T \cdot e^{-h_T t}$, so the value of the contract is given by

² To have a positive definite correlation matrix, the constant correlation coefficient has to satisfy the condition $\sigma > -\frac{1}{n-1}$.


$V = \int_0^2 1 \cdot e^{-rt} f(t)\,dt = \int_0^2 1 \cdot e^{-rt}\, h_T e^{-h_T t}\,dt = \frac{h_T}{r + h_T}\left(1 - e^{-2.0\,(r + h_T)}\right). \qquad (14)$
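As a quick sanity check (a small sketch, not part of the paper), equation (14) reproduces the 0.5823 and 0.1648 values quoted below for the independence and perfect-correlation cases:

```python
import math

def first_to_default_value(n, h, r, maturity=2.0):
    """Equation (14): value of $1 paid at the first default before maturity,
    for n independent credits with constant hazard h and interest rate r."""
    hT = n * h                      # hazard rate of the minimum survival time
    return hT / (r + hT) * (1.0 - math.exp(-maturity * (r + hT)))

print(first_to_default_value(5, 0.1, 0.1))  # 0.5823 (independence)
print(first_to_default_value(1, 0.1, 0.1))  # 0.1648 (perfect correlation limit)
```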

In the general case we use the Monte Carlo simulation approach and the normal copula function to obtain

the distribution of $T$. For each simulation run we have one scenario of default times $t_1, t_2, \ldots, t_n$, from which

we have the first-to-default time simply as $t = \min(t_1, t_2, \ldots, t_n)$.

Let us examine the impact of the asset correlation on the value of the first-to-default contract of 5-assets. If

σ = 0, the expected payoff function, based on equation (14), should give a value of 0.5823. Our simulation of

50,000 runs gives a value of 0.5830. If all 5 assets are perfectly correlated, then the first-to-default of 5-assets

should be the same as the first-to-default of 1-asset since any one default induces all others to default. In this

case the contract should be worth 0.1648. Our simulation of 50,000 runs produces a result of 0.1638. Figure

(6) shows the relationship between the value of the contract and the constant asset correlation coefficient.

We see that the value of the contract decreases as the correlation increases. We also examine the impact of

correlation on the value of the first-to-default of 20 assets in Figure (6). As expected, the first-to-default of

5 assets has the same value as the first-to-default of 20 assets when the asset correlation approaches 1.

7 Conclusion

This paper introduces a few standard techniques used in survival analysis to study the problem of default

correlation. We first introduce a random variable called “the time-until-default” to characterize the default.

Then the default correlation between two credit risks is defined as the correlation coefficient between their

survival times. In practice we usually use market spread information to derive the distribution of survival

times. When it comes to credit portfolio studies we need to specify a joint distribution with given marginal

distributions. The problem cannot be solved uniquely. The copula function approach provides one way of


specifying a joint distribution with known marginals. The concept of copula functions, their basic properties

and some commonly used copula functions are introduced. The calibration of the correlation parameter used

in copula functions against some popular credit models is also studied. We have shown that CreditMetrics

essentially uses the normal copula function in its default correlation formula even though CreditMetrics does

not use the concept of copula functions explicitly. Finally we show some numerical examples to illustrate the

use of copula functions in the valuation of credit derivatives, such as credit default swaps and first-to-default

contracts.

References

[1] Bowers, N. L., JR., Gerber, H. U., Hickman, J. C., Jones, D. A., and Nesbitt, C. J., Actuarial Mathe-

matics, 2nd Edition, Schaumberg, Illinois, Society of Actuaries, (1997).

[2] Carty, L. and Lieberman, D. Historical Default Rates of Corporate Bond Issuers, 1920-1996, Moody’s

Investors Service, January (1997).

[3] Cox, D. R. and Oakes, D. Analysis of Survival Data, Chapman and Hall, (1984).

[4] Dall’Aglio, G., Frechet Classes and Compatibility of Distribution Functions, Symp. Math., 9, (1972),

pp. 131-150.

[5] Delianedis, G. and R. Geske, Credit Risk and Risk Neutral Default Probabilities: Information about

Rating Migrations and Defaults, Working paper, The Anderson School at UCLA, (1998).

[6] Duffie, D. and Singleton, K. Modeling Term Structure of Defaultable Bonds, Working paper, Graduate

School of Business, Stanford University, (1997).

[7] Frechet, M. Sur les Tableaux de Correlation dont les Marges sont Donnees, Ann. Univ. Lyon, Sect. A 9,

(1951), pp. 53-77.

[8] Frees, E. W. and Valdez, E., Understanding Relationships Using Copulas, North American Actuarial Journal, (1998), Vol. 2, No. 1, pp. 1-25.


[9] Gupton, G. M., Finger, C. C., and Bhatia, M. CreditMetrics – Technical Document, New York: Morgan

Guaranty Trust Co., (1997).

[10] Lehmann, E. L. Some Concepts of Dependence, Annals of Mathematical Statistics, 37, (1966), pp.

1137-1153.

[11] Li, D. X., Constructing a Credit Curve, Credit Risk: A RISK Special Report, (November 1998), pp.

40-44.

[12] Litterman, R. and Iben, T. Corporate Bond Valuation and the Term Structure of Credit Spreads, Financial

Analysts Journal, (1991), pp. 52-64.

[13] Lucas, D. Default Correlation and Credit Analysis, Journal of Fixed Income, Vol. 11, (March 1995),

pp. 76-87.

[14] Merton, R. C. On the Pricing of Corporate Debt: The Risk Structure of Interest Rates, Journal of

Finance, 29, (1974), pp. 449-470.

[15] Nelsen, R. An Introduction to Copulas, Springer-Verlag New York, Inc., 1999.

[16] Sklar, A., Random Variables, Joint Distribution Functions and Copulas, Kybernetika 9, (1973), pp.

449-460.


Figure 1: Hazard Rate Function of B Grade Based on Moody’s Study (1997)

[Figure: hazard rate (about 0.060 to 0.075) against years 1 to 6.]


Figure 2: Credit Curve A

[Figure: Credit Curve A, instantaneous default probability (hazard rate about 0.045 to 0.075) against date, 09/10/1998 to 09/10/2010; spread = 300 bps, recovery rate = 50%.]


Figure 3: Credit Curve B

[Figure: Credit Curve B, instantaneous default probability (hazard rate about 0.08 to 0.12) against date, 09/10/1998 to 09/10/2010; spread = 500 bps, recovery rate = 50%.]


Figure 4: The Discrete Default Correlation vs. the Length of Time Interval

[Figure: discrete default correlation (about 0.15 to 0.30) against length of period, 1 to 9 years.]


Figure 5: Impact of Asset Correlation on the Value of Credit Swap

[Figure: default swap value (400 to 500 bps) against asset correlation, -1.0 to 1.0.]


Figure 6: The Value of First-to-Default vs. Asset Correlation

[Figure: first-to-default premium (0.2 to 1.0) against asset correlation, 0.1 to 0.7, for 5-asset and 20-asset baskets.]


The Valuation of the ith-to-Default Basket Credit Derivatives

David X. Li
RiskMetrics Group

44 Wall Street, New York, NY 10005
Phone: 212-981-7453
Fax: 212-981-7402

Email: [email protected]

June 17, 1999

Abstract

This article provides a simple approach to the valuation of the ith-to-default credit derivatives. We define the survival time of each credit as a random variable and characterize the variable by a hazard rate function. The default correlation is incorporated by using a statistical concept called “copula function” which allows us to combine marginal distributions to form a joint distribution with a given correlation structure. It is shown that the correlation parameters in the normal copula function can be interpreted as the asset correlation used by CreditMetrics. In the simple case of a credit portfolio with the same marginal survival time distribution for each credit and a constant pairwise asset correlation we can value the ith-to-default contracts analytically. In the general case a Monte Carlo simulation algorithm is provided. Numerical examples are given for illustration purposes.


1 Introduction

The rapidly growing credit derivative market has created a new set of financial instruments which can be used to manage the most important dimension of financial risk - credit risk. In addition to the standard credit derivative products, such as credit default swaps and total return swaps based upon a single underlying credit risk, many new products are now associated with a portfolio of credit risks. A typical example is the product with payment contingent upon the time and identity of the first or second-to-default in a given credit risk portfolio. Variations include the instruments with payment contingent upon the cumulative loss before a given time in the future. The equity tranche of a CBO (Collateralized Bond Obligations) and CLO (Collateralized Loan Obligations) is yet another variation, where the holder of the equity tranche incurs the first loss. Deductible and stop-loss in insurance products could also be incorporated into the basket credit derivatives structure. As more financial firms try to manage their credit risk at the portfolio level and the CBO/CLO market continues to expand, the demand for basket credit derivative products will most likely continue to grow.

Central to the valuation of the credit derivatives written on a credit portfolio is the problem of default correlation. The problem of default correlation even arises in the valuation of a simple credit default swap with one underlying reference asset if the independence of default between the reference asset and the default swap seller is not assumed. Surprising though it may seem, the default correlation has not been well defined and understood in finance. Existing literature tends to define default correlation based on discrete events which dichotomize according to survival or nonsurvival at a critical period such as one year. For example, if we denote

$q_1 = \Pr[E_1], \quad q_2 = \Pr[E_2], \quad q_{12} = \Pr[E_1 E_2]$

where $E_1$, $E_2$ are defined as the default events of credit risks 1 and 2 over one year. Then the default correlation $\rho_{12}$ between the two default events $E_1$ and $E_2$, based on the standard definition of correlation of two random variables, is defined as follows

$\rho_{12} = \frac{q_{12} - q_1 \cdot q_2}{\sqrt{q_1(1 - q_1)\,q_2(1 - q_2)}}, \qquad (1)$


where $\Pr[E_1 E_2]$ is the probability that both events $E_1$ and $E_2$ occur over one year. This discrete event approach has been taken by Lucas (1995) and Gupton et al. (1997).

However, the choice of a specific period like one year is more or less arbitrary. It may correspond with many empirical studies of default rates over a one-year period. But the dependence of default correlation on a specific time interval has its disadvantages. First, default is a time-dependent event, and so is default correlation. Let us take the survival time of a human being as an example. The probability of dying within one year for a person aged 50 years today is about 0.6%, but the probability of dying for the same person within 50 years is almost a sure event. Similarly default correlation is a time-dependent quantity. Let us now take the survival times of a couple, both aged 50 today. The correlation between the two discrete events that each dies within one year should be very small. But the correlation between the two discrete events that each dies within 100 years should be 1. Second, concentrating on a single period of the survival curve wastes some important information. There are empirical studies which show that the default tendency of corporate bonds is linked to the survival time. Also there are strong links between the economic cycle and defaults. Arbitrarily focusing on a one-year period would neglect this important information. Third, in the majority of credit derivative valuations what we need is not the default correlation of two entities over the next one year. We usually need to have a joint distribution of survival times for the maturity of the transactions, and the maturity is not necessarily one year. Fourth, the estimation of default correlation based on discrete events over one year has some shortcomings. Some delicate issues, such as censoring,¹ cannot be treated properly.

This paper introduces some techniques used in survival analysis which have been widely applied to other areas, such as life contingencies in actuarial science and industry life testing in reliability studies, which share a lot of similarities with the credit problems we encounter here. We first introduce a random variable called “time-until-default” to denote the survival time of each defaultable entity or financial instrument. Then, we define the default correlation of two entities as the correlation between their survival times. In credit derivative valuation we need first

¹ A company that is observed, default free, by Moody’s for 5 years and then withdraws from the Moody’s study must have a survival time exceeding 5 years. Another company may enter into Moody’s study in the middle of a year, which implies that Moody’s observes the company for only half of the one-year observation period. In the survival analysis of statistics, such incomplete observation of default time is called censoring. According to Moody’s studies, such incomplete observation does occur in Moody’s credit default samples.


to construct a credit curve for each credit risk. A credit curve gives all marginal conditional default probabilities over a number of years in the future. This curve is usually derived from the market observed spread curve of risky bonds or asset swaps. The spread curve of risky bonds or asset swaps contains information on default probabilities, recovery rate and liquidity factors etc. Assuming an exogenous recovery rate and a default treatment, we can extract a credit curve from the spread curve. For two credit risks, we would obtain two credit curves from market observable information on the spread curves of the two credit risks. Then, we need to specify a joint distribution for the survival times of the two credit risks such that the marginal distribution of survival time for each credit risk is the derived credit curve. Obviously, this problem has no unique solution. Copula functions used in multivariate statistics provide a convenient way to specify a joint distribution with given marginal distributions. The basic concept of the copula functions and their properties are discussed. The correlation parameters used in normal copula functions can be shown to be the asset correlation used in CreditMetrics, which provides us a convenient way to calibrate the correlation parameters in the copula functions.

The paper is organized as follows. In Section 2 we briefly describe each element of the building blocks of our model. Then in Section 3 we study the valuation of the ith-to-default contract when the normal copula function is used. We first outline a Monte Carlo simulation algorithm for the general case. In the special case of a credit portfolio with the same marginal term structure of default rates and a constant positive pairwise asset correlation we show how to value the ith-to-default contract analytically based on the distribution of the order statistics of the multivariate normal distribution. In Section 4 we give some numerical examples on the valuation of the first, second and third-to-default contracts to illustrate the computational aspect of our model. Section 5 concludes the paper.

2 Model Setup

In this section we present the general setup and building blocks of our model.

2.1 The Characterization of Default

We consider a portfolio of $n$ credit risks. A credit risk is typically a financial instrument, such as a corporate bond or a corporation, which could fail to keep its promise. We assume that each credit is characterized by its survival time, $T$,


which measures the time from today to the time of default. The characterization of this random variable can be described by its density function $f(t)$, distribution function $F(t)$, survival function $S(t)$ or hazard rate function $h(t)$. The hazard rate function

$h(t) = \frac{f(t)}{1 - F(t)} = \lim_{\Delta t \to 0} \frac{F(t + \Delta t) - F(t)}{\Delta t\,(1 - F(t))} = \lim_{\Delta t \to 0} \frac{\Pr[t < T \le t + \Delta t \mid T > t]}{\Delta t}$

has a conditional probability density interpretation: it gives the value of the conditional probability density function of $T$ at exact time $t$, given survival to that time. The relationship of the hazard rate function with the distribution function and survival function is as follows

$h(t) = \frac{f(t)}{1 - F(t)} = -\frac{S'(t)}{S(t)}. \qquad (2)$

Then, the survival function can be expressed in terms of the hazard rate function,

$S(t) = e^{-\int_0^t h(s)\,ds}.$

With a hazard rate function we can calculate any default probabilities. To make probability statements about a security which has survived $x$ years, the future lifetime for this security is $T - x \mid T > x$. We introduce two more notations

${}_t q_x = \Pr[T - x \le t \mid T > x], \quad t \ge 0$
${}_t p_x = 1 - {}_t q_x = \Pr[T - x > t \mid T > x], \quad t \ge 0. \qquad (3)$

The symbol ${}_t q_x$ can be interpreted as the conditional probability that the security $A$ will default within the next $t$ years conditional on its survival for $x$ years. If $t = 1$, we use the actuarial convention to omit the prefix 1 in the symbols ${}_t q_x$ and ${}_t p_x$.


A typical assumption is that the hazard rate is a constant, h, over a certain period, such as [x, x + 1]. In this case, the density function is

$$f(t) = h e^{-ht},$$

which shows that the survival time follows an exponential distribution with parameter h. Under this assumption, the survival probability over the time interval [x, x + t] for 0 < t ≤ 1 is

$${}_tp_x = 1 - {}_tq_x = e^{-\int_0^t h(s)\,ds} = e^{-ht} = (p_x)^t, \tag{4}$$

where p_x is the probability of survival over a one-year period. This assumption can be used to scale down the default probability over one year to a default probability over a time interval of less than one year.
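As a concrete illustration of this scaling rule, a one-year default probability can be converted to a six-month probability as in the minimal sketch below (our own illustration, not part of the original text; the 7.27% input is only an example value):

```python
# A minimal sketch of the scaling rule in equation (4): under a constant
# hazard rate, the t-year survival probability is (p_x)^t.
# The 7.27% one-year default probability is an illustrative input.
import math

q_one_year = 0.0727               # one-year default probability
p_one_year = 1.0 - q_one_year     # one-year survival probability p_x
h = -math.log(p_one_year)         # implied constant hazard rate

t = 0.5                           # half a year
p_half = p_one_year ** t          # survival probability over six months
q_half = 1.0 - p_half             # default probability over six months

print(f"h = {h:.4f}, six-month default probability = {q_half:.4%}")
```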

Modeling a default process is equivalent to modeling a hazard rate function. There are a number of reasons why modeling the hazard rate function may be a good idea. First, it provides us useful information on the immediate default risk of each entity known to be alive at each age t. Second, comparisons of groups of individuals are most incisively made via the hazard rate function. Third, hazard rate function based models can be more easily adapted to more complicated situations, such as when there is censoring, when there are several types of default, or when we would like to consider stochastic default fluctuations. Fourth, there are a lot of similarities between the hazard rate function and the short rate, and many modeling techniques for short rate processes can be readily borrowed to model the hazard rate.

2.2 Definition of Default Correlations

The default correlation of two entities A and B is defined with respect to their survival times T_A and T_B as follows:

$$\rho_{AB} = \frac{\mathrm{Cov}(T_A, T_B)}{\sqrt{\mathrm{Var}(T_A)\,\mathrm{Var}(T_B)}} \tag{5}$$
$$= \frac{E(T_A T_B) - E(T_A)E(T_B)}{\sqrt{\mathrm{Var}(T_A)\,\mathrm{Var}(T_B)}}. \tag{6}$$


This definition of default correlation based on survival times is much more general than one based on two discrete events. If we have the joint distribution f(s, t) of two survival times, we can calculate the default correlation of two discrete events over any length of time interval. For example, if we define

$$E_1 = [T_A < 1], \qquad E_2 = [T_B < 1],$$

then the default correlation for these two discrete default events over a one-year horizon can be calculated using equation (1) with the following substitutions:

$$q_{12} = \int_0^1\!\!\int_0^1 f(s, t)\,ds\,dt, \qquad q_1 = \int_0^1 f_A(s)\,ds, \qquad q_2 = \int_0^1 f_B(t)\,dt.$$

However, knowing the default correlation of two discrete events over a one-year period does not allow us to specify the default correlation of the two survival times.

2.3 The Construction of Credit Curve

The characterization of the distribution of survival time can also be called a credit curve, due to the similarity between a yield curve and a hazard rate function. But the basic question here is: how do we obtain the credit curve, or the distribution of survival time, for a given credit?

There exist three methods to obtain the term structure of default rates:

1. Obtaining historical default information from rating agencies;

2. Taking the Merton (1974) option theoretical approach and its extensions;

3. Taking the implied approach using market prices of defaultable bonds or asset swap spreads.


Rating agencies like Moody's publish historical default rate studies regularly. In addition to the commonly cited one-year default rates, they also present multi-year default rates. From these rates we can obtain the hazard rate function for the survival time. For example, Moody's publishes weighted average cumulative default rates from 1 to 20 years. From the study by Carty and Lieberman (1997) we find that the first 5 years' cumulative default rates for the B rating, in percentage terms, are 7.27, 13.87, 19.94, 25.03 and 29.45. From these rates we can derive the marginal conditional default probabilities: the first marginal conditional default probability in year one is simply the one-year default probability, 7.27%. The other marginal conditional default probabilities can be obtained using the following formula:

$${}_{n+1}q_x = {}_nq_x + {}_np_x \cdot q_{x+n}, \tag{7}$$

which simply states that the probability of default over the time interval [x, x + n + 1] is the sum of the probability of default over the time interval [x, x + n], plus the probability of survival to the end of the nth year and then default during the following one year. Using equation (7) we have the marginal conditional default probability

$$q_{n+1} = \frac{{}_{n+1}q_x - {}_nq_x}{1 - {}_nq_x},$$

which gives us the marginal conditional default probabilities in years 2, 3, 4, 5 as 7.12%, 7.05%, 6.36% and 5.90%. Using an assumption of a piecewise constant hazard rate function over each year, we can translate these marginal default probabilities into a hazard rate function using equation (4). The result is plotted in Figure 1.
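The bootstrap just described is mechanical enough to express in a few lines of code. The sketch below (our own illustration, not code from the paper) reproduces the marginal conditional default probabilities and the piecewise constant hazard rates from the quoted Moody's B-rating cumulative rates:

```python
# Bootstrapping marginal conditional default probabilities and piecewise
# constant hazard rates from cumulative default rates, per equations (7)
# and (4). Inputs are the B-rating rates quoted in the text.
import math

cum = [0.0727, 0.1387, 0.1994, 0.2503, 0.2945]   # 1- to 5-year cumulative

marginal = [cum[0]]
for n in range(len(cum) - 1):
    # rearranged equation (7): q_{n+1} = ({n+1}q_x - {n}q_x) / (1 - {n}q_x)
    marginal.append((cum[n + 1] - cum[n]) / (1.0 - cum[n]))

# piecewise constant hazard over each year via equation (4): q = 1 - e^{-h}
hazard = [-math.log(1.0 - q) for q in marginal]

for year, (q, h) in enumerate(zip(marginal, hazard), start=1):
    print(f"year {year}: marginal default prob {q:.2%}, hazard rate {h:.4f}")
# the marginal probabilities come out as 7.27%, 7.12%, 7.05%, 6.36%, 5.90%
```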

Using diffusion processes to describe changes in the value of the firm, Merton (1974) demonstrated that a firm's default could be modeled with the Black and Scholes methodology. He showed that stock could be considered as a call option on the firm with strike price equal to the face value of a single payment debt issue. Using this framework we can obtain the default probability for the firm over one period, from which we can translate this default probability into a hazard rate function. Geske (1977) and Delianedis and Geske (1998) extended Merton's analysis to produce a term structure of default probabilities. Using the relationship between the hazard rate and the default probabilities we can obtain a credit curve.

Alternatively, we can take the implicit approach by using market observable information, such as asset swap spreads or risky corporate bond prices. This is the approach used by most credit derivative trading desks. The extracted default probabilities reflect the market-agreed perception today about the future default tendencies of the underlying credit. Li (1998) presented one approach to building the credit curve from market information based on the Duffie and Singleton (1997) default treatment. In that paper the author assumes that there exists a series of bonds with maturities 1, 2, ..., n years, which are issued by the same company and have the same seniority. All the information used there is market observable. From the market prices of these bonds we can calculate their yields to maturity. Using the yields to maturity of the corresponding treasury bonds we obtain a yield spread curve over treasury (or the asset swap spreads for a yield spread curve over LIBOR). The credit curve construction is based on this yield spread curve and an exogenous assumption about the recovery rate, based on the seniority and the rating of the bonds and the industry of the corporation. Details can be found in that paper. Hereafter we assume we have constructed a credit curve for each credit risk we study.

2.4 Correlating Survival Times Using Copula Functions

Let us study some problems of an n-credit portfolio. Using the historical approach, the Merton option theoretical approach, or the market implicit approach, we can construct the marginal distribution of survival time for each of the credit risks in the portfolio. If we assume mutual independence among the credit risks, we can study any problem associated with the portfolio. However, the independence assumption for the credit risks is obviously not realistic; in reality, the default rate of a group of credit risks tends to be higher when the economy is in recession and lower when the economy is booming. This implies that each credit is subject to the same macroeconomic environment, and that there exists some form of positive dependence of default among the credit risks. In order to introduce a correlation structure into the portfolio, we must determine how to specify a joint distribution of survival times with given marginal distributions.

Obviously, this problem has no unique solution. Generally speaking, knowing the joint distribution of random variables allows us to derive the marginal distributions and the correlation structure among the random variables, but not vice versa. There are many different techniques in statistics which allow us to specify a joint distribution function with given marginal distributions and a correlation structure. Among them, the copula function is a simple and convenient approach. We give a brief introduction to the concept of a copula function as follows.

A copula function is a function that links or marries univariate marginals to their full multivariate distribution. For m uniform random variables U_1, U_2, ..., U_m, the joint distribution function C, defined as

$$C(u_1, u_2, \cdots, u_m; \rho) = \Pr[U_1 \le u_1, U_2 \le u_2, \cdots, U_m \le u_m],$$

is also called a copula function.

A copula function can be used to link marginal distributions to form a joint distribution. For given univariate marginal distribution functions F_1(x_1), F_2(x_2), ..., F_m(x_m), the function

$$C(F_1(x_1), F_2(x_2), \cdots, F_m(x_m)) = F(x_1, x_2, \cdots, x_m),$$

which is defined using a copula function C, results in a multivariate distribution function with the univariate marginal distributions as specified, F_1(x_1), F_2(x_2), ..., F_m(x_m).

This property can be easily shown as follows:

$$\begin{aligned}
C(F_1(x_1), F_2(x_2), \cdots, F_m(x_m); \rho)
&= \Pr[U_1 \le F_1(x_1), U_2 \le F_2(x_2), \cdots, U_m \le F_m(x_m)] \\
&= \Pr[F_1^{-1}(U_1) \le x_1, F_2^{-1}(U_2) \le x_2, \cdots, F_m^{-1}(U_m) \le x_m] \\
&= \Pr[X_1 \le x_1, X_2 \le x_2, \cdots, X_m \le x_m] \\
&= F(x_1, x_2, \cdots, x_m).
\end{aligned}$$

The marginal distribution of X_i is

$$\begin{aligned}
C(F_1(+\infty), F_2(+\infty), \cdots, F_i(x_i), \cdots, F_m(+\infty); \rho)
&= \Pr[X_1 \le +\infty, X_2 \le +\infty, \cdots, X_i \le x_i, \cdots, X_m \le +\infty] \\
&= \Pr[X_i \le x_i] \\
&= F_i(x_i).
\end{aligned}$$

Sklar (1959) established the converse. He showed that any multivariate distribution function F can be written in the form of a copula function. He proved the following: if F(x_1, x_2, ..., x_m) is a joint multivariate distribution function with univariate marginal distribution functions F_1(x_1), F_2(x_2), ..., F_m(x_m), then there exists a copula function C(u_1, u_2, ..., u_m) such that

$$F(x_1, x_2, \cdots, x_m) = C(F_1(x_1), F_2(x_2), \cdots, F_m(x_m)).$$


If each F_i is continuous, then C is unique. Thus, copula functions provide a unifying and flexible way to study multivariate distributions.

A common copula function we use is the multivariate normal copula function

$$C(u_1, u_2, \cdots, u_m) = \Phi_m\!\left(\Phi^{-1}(u_1), \Phi^{-1}(u_2), \cdots, \Phi^{-1}(u_m); \Sigma\right), \tag{8}$$

where $\Phi_m$ is the m-dimensional normal distribution function with correlation coefficient matrix $\Sigma$, $\Phi$ is the univariate standard normal distribution function, and $\Phi^{-1}$ is the inverse of the univariate standard normal distribution function.
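For the bivariate case, equation (8) can be evaluated directly with standard numerical libraries. The following sketch (our own illustration, not code from the paper) computes C(u, v) for the normal copula:

```python
# A sketch of the bivariate normal copula of equation (8):
# C(u, v) = Phi_2(Phi^{-1}(u), Phi^{-1}(v); rho).
from scipy.stats import norm, multivariate_normal

def normal_copula(u, v, rho):
    """Bivariate normal copula evaluated at (u, v) with correlation rho."""
    biv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    return biv.cdf([norm.ppf(u), norm.ppf(v)])

# probability that both uniform marginals fall below 10% when rho = 0.25
print(normal_copula(0.10, 0.10, 0.25))
```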

To compare different copula functions, we need a correlation measurement independent of the marginal distributions. The usual Pearson's correlation coefficient, however, depends on the marginal distributions (Lehmann (1966)). Both Spearman's Rho and Kendall's Tau can be defined using a copula function only, as follows:

$$\rho_s = 12 \iint [C(u, v) - uv]\,du\,dv,$$
$$\tau = 4 \iint C(u, v)\,dC(u, v) - 1.$$

Comparisons between results using different copula functions should therefore be based on either a common Spearman's Rho or a common Kendall's Tau.

Further examination of copula functions can be found in the survey paper by Frees and Valdez (1998) and the recent book by Nelsen (1999).
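To make the comparison concrete, Spearman's Rho for the normal copula can be evaluated numerically from the integral above. The sketch below is our own illustration; the closed form (6/π)·arcsin(ρ/2), a standard result for the normal copula, is printed only as a check:

```python
# Numerical evaluation of rho_s = 12 * int int [C(u,v) - uv] du dv
# for the normal copula, on a midpoint grid over the unit square.
import numpy as np
from scipy.stats import norm, multivariate_normal

def spearman_rho_normal_copula(rho, m=50):
    biv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    u = (np.arange(m) + 0.5) / m                  # midpoints on (0, 1)
    U, V = np.meshgrid(u, u)
    pts = np.column_stack([norm.ppf(U.ravel()), norm.ppf(V.ravel())])
    C = biv.cdf(pts).reshape(m, m)                # copula values C(u, v)
    return 12.0 * np.mean(C - U * V)              # grid estimate of the integral

rho = 0.25
print(spearman_rho_normal_copula(rho))        # numerical integral
print(6.0 / np.pi * np.arcsin(rho / 2.0))     # closed form, about 0.2394
```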

2.5 Calibrating Correlation Parameters in Copula Functions

Having chosen a copula function, we need the pairwise correlations of survival times for a credit portfolio. Currently there exist two approaches to default correlation. The first approach derives default correlation from the variation of default rates across time. This is the approach used in CreditRisk+ by Credit Suisse Financial Products (1997), and it has long been used in actuarial science for insurance pricing. The second approach uses asset correlation to quantify default correlation, as in CreditMetrics. Using the CreditMetrics asset correlation approach, we can obtain the default correlation of two discrete events over a one-year period based on definition (1). Using a bivariate copula function with a correlation parameter, we can also calculate the default correlation for the discrete events over one year. By equating the two results we can calibrate the correlation parameter in the copula function. As it happens, CreditMetrics implicitly uses the normal copula function in its default correlation formula, and the correlation parameter in the copula function can be shown to be the asset correlation.

To see this point, we first summarize how CreditMetrics calculates the joint default probability of two credits A and B. Suppose the one-year default probabilities for A and B are q_A and q_B. CreditMetrics would use the following steps:

• Obtain Z_A and Z_B such that

$$q_A = \Pr[Z < Z_A], \qquad q_B = \Pr[Z < Z_B],$$

where Z is a standard normal random variable.

• If ρ is the asset correlation, the joint default probability for credits A and B is calculated as follows:

$$\Pr[Z_1 < Z_A,\ Z_2 < Z_B] = \int_{-\infty}^{Z_A}\!\int_{-\infty}^{Z_B} \phi_2(x, y \mid \rho)\,dx\,dy = \Phi_2(Z_A, Z_B, \rho), \tag{9}$$

where $Z_1$ and $Z_2$ are standard normal variables with correlation ρ, and $\phi_2(x, y \mid \rho)$ is the standard bivariate normal density function with correlation coefficient ρ.

If we use a bivariate normal copula function with a correlation parameter γ, and denote the survival times for A and B as T_A and T_B, the joint default probability over a one-year period can be calculated as follows:

$$\Pr[T_A < 1,\ T_B < 1] = \Phi_2\!\left(\Phi^{-1}(F_A(1)), \Phi^{-1}(F_B(1)), \gamma\right), \tag{10}$$

where F_A and F_B are the distribution functions for the survival times T_A and T_B, and Φ is the univariate standard normal distribution function. If we notice that

$$q_i = \Pr[T_i < 1] = F_i(1) \quad \text{and} \quad Z_i = \Phi^{-1}(q_i) \quad \text{for } i = A, B,$$


then we see that equation (10) and equation (9) give the same joint default probability over a one-year period if ρ = γ.

We can conclude that CreditMetrics uses a bivariate normal copula function with the asset correlation as the correlation parameter in the copula function. Thus, to generate survival times of two credit risks, we use a bivariate normal copula function with a correlation parameter equal to the CreditMetrics asset correlation. We note that this correlation parameter is not the correlation coefficient between the two survival times; the correlation coefficient between the two survival times is much smaller than the asset correlation. Conveniently, the marginal distribution of any subset of an n-dimensional normal distribution is still a normal distribution. Using asset correlations, we can construct high-dimensional normal copula functions to model the joint distribution of the survival times of a credit portfolio.
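The calibration argument can be verified numerically. In the sketch below (our own illustration; the exponential credit curves are an assumption made only for the example), the CreditMetrics probability (9) and the copula probability (10) coincide when γ is set to the asset correlation ρ:

```python
# Checking that equations (9) and (10) give the same one-year joint default
# probability when the copula parameter equals the asset correlation.
# The exponential marginal survival-time distributions are illustrative.
import math
from scipy.stats import norm, multivariate_normal

qA, qB, rho = 0.05, 0.08, 0.30     # one-year default probs, asset correlation
biv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

# CreditMetrics, equation (9): thresholds Z_A, Z_B and Phi_2(Z_A, Z_B, rho)
ZA, ZB = norm.ppf(qA), norm.ppf(qB)
p_creditmetrics = biv.cdf([ZA, ZB])

# Copula, equation (10): Pr[T_A < 1, T_B < 1] with exponential marginals,
# whose one-year default probabilities F_A(1), F_B(1) equal qA, qB by design
hA, hB = -math.log(1.0 - qA), -math.log(1.0 - qB)
FA1, FB1 = 1.0 - math.exp(-hA), 1.0 - math.exp(-hB)
p_copula = biv.cdf([norm.ppf(FA1), norm.ppf(FB1)])

print(p_creditmetrics, p_copula)   # identical, as the argument requires
```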

3 The Valuation of the ith-to-Default Contract

In this section we first provide a general Monte Carlo simulation approach to the valuation of the ith-to-default contract. Then we show that in some special cases the contract can be valued analytically.

3.1 General Simulation Approach

We assume we have a portfolio of n credits. Let us assume that for each credit i in the portfolio we have constructed a credit curve, or hazard rate function, for its survival time T_i. The distribution function of T_i is F_i(t). Using a copula function C we also obtain the joint distribution of the survival times as follows:

$$F(t_1, t_2, \cdots, t_n) = C(F_1(t_1), F_2(t_2), \cdots, F_n(t_n)).$$

If we use the normal copula function, we have

$$F(t_1, t_2, \cdots, t_n) = \Phi_n\!\left(\Phi^{-1}(F_1(t_1)), \Phi^{-1}(F_2(t_2)), \cdots, \Phi^{-1}(F_n(t_n))\right),$$

where $\Phi_n$ is the n-dimensional cumulative normal distribution function with correlation coefficient matrix ρ.

To simulate correlated survival times we introduce another series of random variables Y_1, Y_2, ..., Y_n such that


$$Y_1 = \Phi^{-1}(F_1(T_1)),\quad Y_2 = \Phi^{-1}(F_2(T_2)),\quad \cdots,\quad Y_n = \Phi^{-1}(F_n(T_n)).$$

Then there is a one-to-one mapping between T_1, T_2, ..., T_n and Y_1, Y_2, ..., Y_n. Simulating {T_i | i = 1, 2, ..., n} is equivalent to simulating {Y_i | i = 1, 2, ..., n}. As shown in the previous section, the correlation between any of the Y's is the asset correlation of the underlying credits. Using the CreditMetrics approach to asset correlation we can obtain the asset correlation matrix ρ. We have the following simulation scheme:

• Simulate Y_1, Y_2, ..., Y_n from an n-dimensional normal distribution with correlation coefficient matrix ρ.

• Obtain T_1, T_2, ..., T_n using $T_i = F_i^{-1}(\Phi(Y_i))$, i = 1, 2, ..., n.

With each simulation run we have full information on the survival times of all the credits in the portfolio. With this information we can sort the survival times to obtain the ith-to-default survival time and the identity of the ith-to-default. Any credit derivative structure based on the ith-to-default of a credit portfolio can then be readily valued. This framework can also be used to value any credit derivative based on the evolution of the loss function over a certain period of time.
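The two-step scheme is easy to implement. The sketch below (our own illustration; the exponential marginals F_i(t) = 1 − e^{−h_i t} and the parameter values are assumptions made for the example) simulates correlated survival times and prices a contract paying $1 at the ith default if it occurs before maturity M:

```python
# Monte Carlo implementation of the simulation scheme above: draw correlated
# normals Y, map them to survival times via T_i = F_i^{-1}(Phi(Y_i)), sort,
# and discount the payoff of the i-th default if it occurs before M.
import numpy as np
from scipy.stats import norm

def ith_to_default_mc(i, hazards, corr, r, M, n_paths=50_000, seed=42):
    rng = np.random.default_rng(seed)
    n = len(hazards)
    chol = np.linalg.cholesky(corr)                  # asset correlation matrix
    Y = rng.standard_normal((n_paths, n)) @ chol.T   # correlated normals
    T = -np.log(1.0 - norm.cdf(Y)) / np.asarray(hazards)  # exponential inverse
    T_i = np.sort(T, axis=1)[:, i - 1]               # i-th order statistic
    return np.where(T_i < M, np.exp(-r * T_i), 0.0).mean()

n, rho = 5, 0.25
corr = rho * np.ones((n, n)) + (1.0 - rho) * np.eye(n)
# with h = r = 10% and M = 2 years, this lands near the analytic 0.4997
print(ith_to_default_mc(1, [0.10] * n, corr, 0.10, 2.0))
```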

3.2 Analytic Results in Special Cases

The simulation algorithm outlined above can be applied to the general case, with any asset correlation structure and any marginal distributions of survival times for the credits in a portfolio. However, in some special cases we can value the ith-to-default contract analytically. We discuss two such cases here.

3.2.1 The First-to-Default Contract in the Case of Independence of Survival Times

If all the credits in the portfolio are independent, the hazard rate of the minimum survival time T = min(T_1, T_2, ..., T_n) is easily shown to be

$$h_T(t) = h_1(t) + h_2(t) + \cdots + h_n(t).$$


Then the survival time of the first-to-default has density function $f(t) = h_T(t)\,e^{-\int_0^t h_T(s)\,ds}$. In this case we can value any contract whose payoff function depends on the survival time of the first-to-default. For example, if we have a contract which pays $1 when the first default of an n-credit portfolio occurs before a maturity term M, we can value the contract as follows:

$$V = \int_0^M 1 \cdot e^{-rt} f(t)\,dt,$$

where r is a constant interest rate. If $h_T(t)$ is a constant $h_T$, so that $f(t) = h_T\,e^{-h_T t}$, the expression simplifies to

$$V = \int_0^M e^{-rt}\, h_T\, e^{-h_T t}\,dt = \frac{h_T}{r + h_T}\left(1 - e^{-M(r + h_T)}\right). \tag{11}$$
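Equation (11) is trivial to evaluate; the sketch below (our own illustration) reproduces two of the values quoted in Section 4:

```python
# Equation (11): value of $1 paid at the first default of n independent
# credits, each with constant hazard rate h, if default occurs before M.
import math

def first_to_default_independent(n, h, r, M):
    hT = n * h                     # hazard rate of the minimum survival time
    return hT / (r + hT) * (1.0 - math.exp(-M * (r + hT)))

print(first_to_default_independent(1, 0.10, 0.10, 2.0))   # about 0.1648
print(first_to_default_independent(5, 0.10, 0.10, 2.0))   # about 0.5823
```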

3.2.2 The ith-to-Default of a Credit Portfolio with a Constant Pairwise Asset Correlation and a Common Marginal Survival Time

In this case, all the credit risks in the portfolio have the same marginal distribution of survival time and a constant pairwise asset correlation ρ. It turns out that we can value the ith-to-default contract analytically if a normal copula function is used.

In the valuation of the ith-to-default contract we need to study the distribution of the order statistics T_(1) ≤ T_(2) ≤ ... ≤ T_(n) of the survival times T_1, T_2, ..., T_n. Let us assume that each credit's survival time follows the same distribution, whose distribution function is denoted F(t); then we have

$$T_i = F^{-1}(\Phi(Y_i)), \tag{12}$$

where Y_1, Y_2, ..., Y_n follow a multivariate normal distribution with a constant pairwise correlation ρ. Since we have the same marginal distribution F for all survival times, translating Y_1, Y_2, ..., Y_n into T_1, T_2, ..., T_n using equation (12) does not change the order, so we have

$$T_{(i)} = F^{-1}(\Phi(Y_{(i)})), \quad \text{for } i = 1, 2, \cdots, n.$$


If we know the distribution of Y_(i), we can value any payoff function based on the survival time T_(i). It can be shown (see Appendix) that the density function for Y_(i) is

$$g_{(i)}(x) = \frac{1}{\sqrt{2}\,\pi} \cdot \frac{n!}{(i-1)!\,(n-i)!}\, e^{-\frac{x^2}{2}} \int_{-\infty}^{\infty} \Phi^{i-1}\!\left(\sqrt{1-\rho}\,x + \sqrt{2\rho}\,u\right) \Phi^{n-i}\!\left(-\sqrt{1-\rho}\,x - \sqrt{2\rho}\,u\right) e^{-u^2}\,du.$$

Suppose now that we need to value an ith-to-default contract which pays $1 at the time when the ith default occurs, provided it occurs before the maturity term M. For simplicity we assume that the common survival time follows an exponential distribution with parameter h, i.e., the distribution function is

$$F(x) = 1 - e^{-hx}.$$

We also assume that the interest rate is a constant r. The present value of this contract can be written as

$$\begin{aligned}
V &= E\!\left(e^{-rT_{(i)}};\ T_{(i)} < M\right) \\
  &= E\!\left(e^{-r F^{-1}(\Phi(Y_{(i)}))};\ Y_{(i)} < \Phi^{-1}(F(M))\right) \\
  &= \int_{-\infty}^{\Phi^{-1}(F(M))} (1 - \Phi(x))^{\frac{r}{h}}\, g_{(i)}(x)\,dx \\
  &= \frac{1}{\sqrt{2}\,\pi} \cdot \frac{n!}{(i-1)!\,(n-i)!} \int_{-\infty}^{\Phi^{-1}(F(M))} (1 - \Phi(x))^{\frac{r}{h}}\, e^{-\frac{x^2}{2}} \tag{13} \\
  &\qquad\qquad \times \int_{-\infty}^{\infty} \Phi^{i-1}\!\left(\sqrt{1-\rho}\,x + \sqrt{2\rho}\,u\right) \Phi^{n-i}\!\left(-\sqrt{1-\rho}\,x - \sqrt{2\rho}\,u\right) e^{-u^2}\,du\,dx.
\end{aligned}$$

This expression can be evaluated accurately using Gaussian quadratures: the inside integration can be calculated using Gauss-Hermite quadrature and the outside using Gauss-Legendre quadrature.
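A sketch of this quadrature scheme follows (our own illustration; the truncation of the outer integral at −8 and the node counts are assumptions of the example, not prescriptions from the paper):

```python
# Evaluating equation (13) with Gauss-Hermite quadrature (inner integral,
# weight e^{-u^2}) and Gauss-Legendre quadrature (outer integral).
import numpy as np
from math import exp, lgamma, pi, sqrt
from numpy.polynomial.hermite import hermgauss
from numpy.polynomial.legendre import leggauss
from scipy.stats import norm

def ith_to_default_quadrature(i, n, rho, h, r, M, n_herm=64, n_leg=200):
    u, wu = hermgauss(n_herm)                   # inner nodes and weights
    b = norm.ppf(1.0 - np.exp(-h * M))          # upper limit Phi^{-1}(F(M))
    a = -8.0                                    # truncated lower limit
    x, wx = leggauss(n_leg)
    x = 0.5 * (b - a) * x + 0.5 * (b + a)       # map [-1, 1] to [a, b]
    wx = 0.5 * (b - a) * wx
    c = exp(lgamma(n + 1) - lgamma(i) - lgamma(n - i + 1))  # n!/((i-1)!(n-i)!)
    X, U = np.meshgrid(x, u, indexing="ij")
    arg = sqrt(1.0 - rho) * X + sqrt(2.0 * rho) * U
    inner = (norm.cdf(arg) ** (i - 1) * norm.cdf(-arg) ** (n - i)) @ wu
    outer = wx @ (norm.sf(x) ** (r / h) * np.exp(-x ** 2 / 2.0) * inner)
    return c / (sqrt(2.0) * pi) * outer

# base case of Table 1: first-to-default of 5 assets, about 0.4997 (Table 2)
print(ith_to_default_quadrature(1, 5, 0.25, 0.10, 0.10, 2.0))
```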


Table 1: The Basic Set of Parameters

Parameter               Value
Term of the Contract    2 years
Interest Rate           10%
Hazard Rate             10%
Correlation             0.25
Number of Assets        5

4 Numerical Examples

In this section we produce some numerical examples to illustrate how our method can be used to value a specific contract. We assume that the contract is a two-year transaction which pays one dollar if the ith-to-default of an n-credit portfolio occurs during the first two years. We also assume a constant interest rate of 10%, a flat term structure of default rates of 10%, and a constant pairwise asset correlation of 0.25. The basic set of parameters is summarized in Table 1. If all the credits in the portfolio are independent, the hazard rate of the minimum survival time is h_T = n · h, and the contract value can be simply calculated using equation (11). In other general cases the contract value is given by equation (13), which is evaluated using Gaussian quadratures.

First, we examine the impact of the number of assets on the value of the first, second and third-to-default contracts. If there is only one asset, the value of the first-to-default contract, based on equation (11), should be 0.1648. As the number of assets increases, the chance that there is one default within the two years also grows, as does the value of the first-to-default contract. For the same reason, the value of the second and the third-to-default also increases as the number of assets in the portfolio increases. Figure 2 shows how the value of the first, the second and the third-to-default changes along with the number of assets in the portfolio. We see that the value of these contracts increases at a decreasing rate. When the number of assets increases to 50, the value of the first-to-default becomes 0.9141, the second-to-default 0.8339 and the third-to-default 0.7561. A similar observation can be made in Figure 3, where the asset correlation is 0.50. The values used in Figures 2 and 3 are given in Tables 2 and 3.

Second, we examine the relative value of the first-to-default, second-to-default and third-to-default contracts. Obviously, as we go up the order of the default, the contract should have a smaller value. But as the number of underlying credits in the portfolio increases, the difference between these contract values should also decrease. Figure 2 plots the value of the ith-to-default against the number of underlying credit risks. When there are only three assets, the first-to-default has a value of 0.3731, the second-to-default has a value of 0.1042 and the third-to-default has a value of 0.0172. The value of the first-to-default is 3.6 times that of the second-to-default and 21.7 times that of the third-to-default. When the number of assets increases to 50, these ratios shrink to 1.10 and 1.21, which is consistent with our expectation.

Third, we examine the impact of the correlation on the value of the first-to-default contract on 5 assets. If the asset correlation is 0, the expected payoff, based on equation (11), should give a value of 0.5823. If all 5 assets are perfectly correlated, then the first-to-default of 5 assets should have the same value as the first-to-default of 1 asset, since any one default induces all the others to default; in this case the contract should be worth 0.1648. Figure 4 shows the relationship between the value of the contract and the constant correlation coefficient. We see that the value of the contract decreases as the correlation increases. We also examine the impact of correlation on the value of the first-to-default of 20 assets in Figure 4. As expected, the first-to-default of 5 assets has the same value as the first-to-default of 20 assets when the correlation approaches 1.

We can also use these special cases to examine the accuracy of the simulation algorithm outlined in the last section for general cases. It turns out that with 50,000 runs we can get results accurate to the second decimal place of the analytical results examined here.

5 Conclusions

This paper presents a simple approach to the valuation of ith-to-default credit derivatives. The default event of each credit risk is described by its survival time, which measures the time from today to the time of default. The marginal distribution of survival time for each credit is characterized by a hazard rate function, or a credit curve. The hazard rate function can be obtained either from historical data or by an implied approach using market observable information on risky bond prices or asset swap spreads. The joint distribution of the survival times of an n-credit portfolio can be built using a copula function. It has also been shown that the correlation parameter in a normal copula function can be interpreted as the asset correlation used in CreditMetrics. Under this framework we can essentially value any credit derivative written on the credit portfolio, as well as the nth-to-default contract.

In the general case we outline a Monte Carlo simulation algorithm for the valuation. In the special case of a credit portfolio with the same marginal distribution of survival times and a constant positive pairwise asset correlation, we can value the ith-to-default contract analytically using a normal copula function. The model has been shown to produce expected results under a variety of circumstances.

References

[1] Carty, L. and D. Lieberman. Historical Default Rates of Corporate Bond Issuers, 1920-1996, Moody's Investors Service, January 1997.

[2] Credit Suisse Financial Products. CreditRisk+ - A Credit Risk Management Framework, 1997.

[3] Delianedis, G. and R. Geske. Credit Risk and Risk Neutral Default Probabilities: Information about Rating Migrations and Defaults, Working paper, The Anderson School at UCLA, 1998.

[4] Duffie, D. and K. Singleton. Modeling Term Structures of Defaultable Bonds, Working paper, Graduate School of Business, Stanford University, 1997.

[5] Frechet, M. Sur les Tableaux de Correlation dont les Marges sont Donnees, Ann. Univ. Lyon, Sect. A 9, 1951, pp. 53-77.

[6] Frees, E. W. and E. Valdez. Understanding Relationships Using Copulas, North American Actuarial Journal, Vol. 2, No. 1, 1998, pp. 1-25.

[7] Geske, R. The Valuation of Corporate Liabilities as Compound Options, Journal of Financial and Quantitative Analysis, 1977, pp. 541-552.

[8] Gupton, G. M., C. Finger, and M. Bhatia. CreditMetrics - Technical Document, New York: Morgan Guaranty Trust Co., 1997.

[9] Lehmann, E. L. Some Concepts of Dependence, Annals of Mathematical Statistics, 37, 1966, pp. 1137-1153.

[10] Li, D. X. Constructing a Credit Curve, Credit Risk, A RISK Special Report, November 1998, pp. 40-44.


[11] Lucas, D. Default Correlation and Credit Analysis, Journal of Fixed Income, Vol. 11, March 1995, pp. 76-87.

[12] Merton, R. C. On the Pricing of Corporate Debt: The Risk Structure of Interest Rates, Journal of Finance, 29, 1974, pp. 449-470.

[13] Nelsen, R. An Introduction to Copulas, Springer-Verlag New York, Inc., 1999.

[14] Sklar, A. Random Variables, Joint Distribution Functions and Copulas, Kybernetika 9, 1973, pp. 449-460.

[15] Tong, Y. L. The Multivariate Normal Distribution, Springer-Verlag New York, Inc., 1990.


Appendix: Order Statistics of the Correlated Normal Distribution

Let Z_1, Z_2, ..., Z_n denote identically distributed standard normal N(0, 1) variables and let Z_(1) ≤ Z_(2) ≤ ... ≤ Z_(n) be the corresponding order statistics. Let φ and Φ denote, respectively, the standard normal density function and distribution function. Then we have (see Tong (1990)):

• For every fixed 1 ≤ i ≤ n, if Z_1, Z_2, ..., Z_n are independent, the marginal density function and distribution function of Z_(i) are, respectively,

$$f_{(i)}(z) = \frac{n!}{(i-1)!\,(n-i)!}\,\Phi^{i-1}(z)\,\Phi^{n-i}(-z)\,\phi(z), \tag{14}$$

$$F_{(i)}(z) = \sum_{t=i}^{n} \binom{n}{t}\,\Phi^{t}(z)\,\Phi^{n-t}(-z).$$

• If Z_1, Z_2, ..., Z_n are correlated, but with constant correlation coefficient 0 ≤ ρ < 1, the density function and distribution function of Z_(i) (1 ≤ i ≤ n) are, respectively,

$$g_{(i)}(x) = \frac{1}{\sqrt{1-\rho}} \int_{-\infty}^{\infty} f_{(i)}\!\left(\frac{x + \sqrt{\rho}\,z}{\sqrt{1-\rho}}\right) \phi(z)\,dz, \tag{15}$$

$$G_{(i)}(x) = \int_{-\infty}^{\infty} F_{(i)}\!\left(\frac{x + \sqrt{\rho}\,z}{\sqrt{1-\rho}}\right) \phi(z)\,dz,$$

where $f_{(i)}$ and $F_{(i)}$ are given in equation (14).

In order to use Gaussian quadratures we rewrite the density function $g_{(i)}(x)$ as


follows:

$$\begin{aligned}
g_{(i)}(x) &= \frac{1}{\sqrt{1-\rho}} \cdot \frac{n!}{(i-1)!\,(n-i)!} \int_{-\infty}^{\infty} \Phi^{i-1}\!\left(\frac{x+\sqrt{\rho}\,z}{\sqrt{1-\rho}}\right) \Phi^{n-i}\!\left(-\frac{x+\sqrt{\rho}\,z}{\sqrt{1-\rho}}\right) \phi\!\left(\frac{x+\sqrt{\rho}\,z}{\sqrt{1-\rho}}\right) \phi(z)\,dz \\
&= \frac{1}{2\pi\sqrt{1-\rho}} \cdot \frac{n!}{(i-1)!\,(n-i)!} \cdot e^{-\frac{x^2}{2}} \int_{-\infty}^{\infty} \Phi^{i-1}\!\left(\frac{x+\sqrt{\rho}\,z}{\sqrt{1-\rho}}\right) \Phi^{n-i}\!\left(-\frac{x+\sqrt{\rho}\,z}{\sqrt{1-\rho}}\right) e^{-\frac{(z+\sqrt{\rho}\,x)^2}{2(1-\rho)}}\,dz \\
&= \frac{1}{\sqrt{2}\,\pi} \cdot \frac{n!}{(i-1)!\,(n-i)!} \cdot e^{-\frac{x^2}{2}} \int_{-\infty}^{\infty} \Phi^{i-1}\!\left(\sqrt{1-\rho}\,x + \sqrt{2\rho}\,u\right) \Phi^{n-i}\!\left(-\sqrt{1-\rho}\,x - \sqrt{2\rho}\,u\right) e^{-u^2}\,du,
\end{aligned}$$

where the last step uses the substitution $u = (z + \sqrt{\rho}\,x)/\sqrt{2(1-\rho)}$.

For i = 1 we obtain the density function for the smallest variable Z_(1):

$$g_{(1)}(x) = \frac{n}{\sqrt{2}\,\pi} \cdot e^{-\frac{x^2}{2}} \int_{-\infty}^{\infty} \left[\Phi\!\left(-\sqrt{1-\rho}\,x - \sqrt{2\rho}\,u\right)\right]^{n-1} e^{-u^2}\,du.$$

Similarly, the density function for the second smallest variable Z_(2) is

$$g_{(2)}(x) = \frac{n(n-1)}{\sqrt{2}\,\pi} \cdot e^{-\frac{x^2}{2}} \int_{-\infty}^{\infty} \Phi\!\left(\sqrt{1-\rho}\,x + \sqrt{2\rho}\,u\right) \Phi^{n-2}\!\left(-\sqrt{1-\rho}\,x - \sqrt{2\rho}\,u\right) e^{-u^2}\,du.$$


Figure 1: The Hazard Rate Function for B Rating. (Plot of the piecewise constant hazard rate, between roughly 0.065 and 0.075, against years 1 through 6.)


Figure 2: The Value of the First, Second and Third-to-Default, plotted against the number of assets (asset correlation 0.25; values as in Table 2).


Figure 3: The Value of the First, Second and Third-to-Default, plotted against the number of assets (asset correlation 0.50; values as in Table 3).


Figure 4: The Value of the First-to-Default as a Function of Asset Correlation. (Values for the 5-asset and 20-asset portfolios, plotted against asset correlation from 0.0 to 1.0.)


Table 2: The Value of the First, Second and Third-to-Default when the Constant Pairwise Asset Correlation is 0.25

Number of Assets   First    Second   Third
 1                 0.1648   NA       NA
 2                 0.2835   0.0462   NA
 3                 0.3731   0.1042   0.0172
 4                 0.4433   0.1627   0.0457
 5                 0.4997   0.2177   0.0801
 6                 0.5460   0.2681   0.1169
 7                 0.5847   0.3138   0.1539
 8                 0.6175   0.3551   0.1901
 9                 0.6456   0.3923   0.2248
10                 0.6701   0.4260   0.2577
11                 0.6914   0.4565   0.2887
12                 0.7103   0.4842   0.3179
13                 0.7270   0.5095   0.3453
14                 0.7419   0.5325   0.3709
15                 0.7554   0.5537   0.3949
16                 0.7675   0.5732   0.4175
17                 0.7786   0.5911   0.4386
18                 0.7886   0.6077   0.4584
19                 0.7978   0.6231   0.4770
20                 0.8062   0.6374   0.4945
21                 0.8140   0.6507   0.5110
22                 0.8212   0.6631   0.5266
23                 0.8279   0.6747   0.5413
24                 0.8341   0.6856   0.5551
25                 0.8398   0.6958   0.5683
26                 0.8452   0.7053   0.5807
27                 0.8502   0.7144   0.5925
28                 0.8550   0.7229   0.6037
29                 0.8594   0.7309   0.6143
30                 0.8636   0.7385   0.6244
35                 0.8811   0.7711   0.6684
40                 0.8947   0.7966   0.7036
45                 0.9054   0.8171   0.7323
50                 0.9141   0.8339   0.7561


Table 3: The Value of the First, Second and Third-to-Default when the Constant Pairwise Asset Correlation is 0.5

Number of Assets   First    Second   Third
 1                 0.1648   NA       NA
 2                 0.2621   0.0676   NA
 3                 0.3292   0.1277   0.0375
 4                 0.3796   0.1781   0.0774
 5                 0.4194   0.2205   0.1145
 6                 0.4519   0.2567   0.1481
 7                 0.4793   0.2881   0.1783
 8                 0.5027   0.3156   0.2056
 9                 0.5230   0.3399   0.2302
10                 0.5409   0.3617   0.2527
11                 0.5569   0.3814   0.2732
12                 0.5712   0.3993   0.2921
13                 0.5842   0.4156   0.3095
14                 0.5960   0.4306   0.3256
15                 0.6068   0.4444   0.3406
16                 0.6168   0.4573   0.3546
17                 0.6260   0.4692   0.3677
18                 0.6345   0.4804   0.3800
19                 0.6425   0.4908   0.3915
20                 0.6500   0.5007   0.4024
21                 0.6570   0.5099   0.4128
22                 0.6636   0.5186   0.4225
23                 0.6698   0.5269   0.4318
24                 0.6757   0.5348   0.4407
25                 0.6812   0.5422   0.4491
26                 0.6865   0.5493   0.4571
27                 0.6915   0.5560   0.4648
28                 0.6963   0.5625   0.4721
29                 0.7009   0.5687   0.4792
30                 0.7052   0.5746   0.4859
35                 0.7244   0.6008   0.5161
40                 0.7402   0.6226   0.5413
45                 0.7535   0.6411   0.5628
50                 0.7650   0.6570   0.5815


Worst Loss Analysis of BISTRO Reference Portfolio

CreditMetrics® Monitor, April 1999

Toru Tanaka, VP, Risk Management, Fuji Bank
Sheikh Pancham, PhD, AVP, Risk Management, Fuji Bank ([email protected])
Tamunoye Alazigha, AVP, Risk Monitoring and Control, Fuji Bank

This paper illustrates how CreditManager is applied to perform a Worst Loss analysis on the underlying reference portfolio of J.P. Morgan's BISTRO. The BISTRO is a synthetic Collateralized Loan Obligation (CLO), one of a growing number of structured credit risk products being developed by banks to address loan portfolio management issues. Traditionally, once transactions are originated for the credit portfolio, banks have adopted a 'buy and hold' approach due to the illiquid secondary markets for such positions. More recently, however, banks have recognized that holding credit positions to maturity results in risk/return inefficiencies from burdensome regulatory capital requirements and relationship constraints. Solutions to eliminating these inefficiencies have come in the form of products such as the BISTRO.

In a standard CLO, the originating bank assigns its drawn/funded loans to a Special Purpose Vehicle (SPV), which in turn issues several classes of credit-tranched notes to capital market investors. Losses realized on loan transactions are passed to investors via the tranches, which represent ownership of the transactions. Synthetic CLOs, on the other hand, make use of credit derivatives contracts to transfer the credit risk of a loan portfolio, rather than transferring it through the sale of transactions (as in the standard CLO structure). In this way, only the risk, but not the ownership, of the underlying exposures is transferred.

Arguably, J.P. Morgan's BISTRO has been the most active synthetic CLO issue. It is aimed at institutional spread investors. The BISTRO SPV (Trust) offers two levels of credit-tranched notes in addition to having an equity reserve account. Investors' proceeds are used to purchase Treasury Notes paying a fixed coupon and maturing on the BISTRO maturity date. At the same time, the BISTRO SPV enters into a credit default swap with Morgan Guaranty Trust (MGT) referencing the underlying credits in a pool of companies, each with a specified notional amount. Under the terms of the swap, MGT pays the BISTRO Trust a fixed semi-annual payment comprising the spread between the Treasury coupons and the coupons promised on the issued BISTRO Notes. In return, at the Notes' maturity the Trust compensates MGT for losses experienced as a result of credit events. Investors are not in a first-loss position with respect to this portfolio of credit risk: payments are made by the BISTRO Trust only after, and to the extent that, losses due to credit events have exceeded the first-loss threshold (the equity reserve account). Credit events are based on ISDA credit swap definitions, including bankruptcy, failure to pay, cross acceleration, restructuring or repudiation. Losses are computed either by a computation of final work-out value for companies emerging from bankruptcy prior to maturity or by soliciting bids from the market for senior unsecured obligations, and are allocated to the two tranches according to seniority. To date, there have been three outstanding issuances of BISTROs. This analysis focuses on the BISTRO Trust 1997-1000. Table 1, Table 2, and Chart 1 provide summary information on this issue.

Table 1. BISTRO Trust 1997-1000 Tranches.

Description              Amount (US$M)   Coupon   Rating
Super-Senior Tranche     8,993           -        Not Securitized
Senior Notes             460             6.35%    Aaa (Moody's)
Subordinated Notes       237             9.50%    Ba2 (Moody's)
Equity Reserve Account   32              -        Not Securitized


Table 2. BISTRO Trust 1997-1000 Underlying Portfolio.

Face Amount         US$9.72B
Reference Credits   307 Senior Unsecured Obligations of US, European, and Canadian Companies
Maturity            December 31, 2002
Collateral          5.625% US Treasury Notes

Chart 1. BISTRO Structure and Cash Flow. (Diagram: investors' proceeds are used to buy 5.625% Treasury Notes that mature with the BISTRO; Morgan Guaranty Trust pays the BISTRO SPV a swap premium of 0.725% of the senior notional and 3.875% of the subordinated notional against the 307-company, US$9.72B reference pool; the SPV pays the 6.35% senior and 9.50% subordinated coupons, and at maturity returns face amount less realized credit-event losses, which are absorbed first by the 0.33% first-loss equity reserve.)

An institutional investor in the BISTRO Notes has exposure to the portfolio of underlying reference credits. The exposure over the term (4 years remaining) of the BISTRO is examined using existing functionality in CreditManager. The methodology to do this involves a few simple steps:

1. Data on the underlying credits is obtained from the Credit Derivatives Group at J.P. Morgan. The data set includes reference names, notional amounts, credit ratings, and country and industry classifications. Credit ratings and country and industry classifications are necessary inputs for creating the Obligors import file.

2. A 4-year default/transition probability matrix is created manually in CreditManager using data from Moody's credit research reports (or S&P's).

3. In preparing the Exposures file for import, the following inputs are used:

(a) Asset Type is set to Credit Def. Swap, since the BISTRO can be viewed as a basket of default swaps on the underlying reference credits.

(b) Maturity of the underlying reference asset is set equal to the maturity of the BISTRO.



(c) Seniority Class is set to SU (Senior Unsecured).

(d) Information on the Coupon Rate or Spread of the underlying reference is not disclosed and is therefore proxied by yields obtained from credit spread curves for senior unsecured obligations.

(e) The input for the Swap Spread Premium is obtained from a single asset credit default swap pricing model, Quantserv's CreditPricer (based on the Jarrow-Turnbull methodology). The Swap Spread Premium can also be proxied by market quotes on asset swap spreads from sources such as Bloomberg.

4. The Simulation Summary Report is run in CreditManager with Preferences set to a 4-year horizon. The important output statistics for our purposes in this report are the Change due to Default and the Losses from the Mean at the 5th, 1st, and 0.1th percentile levels.

The Change due to Default is basically the Expected Default Loss experienced by the portfolio, given the exposure amounts, credit ratings, default probabilities, recovery rates, and correlations among the underlying reference credits. The Expected Default Loss is expected to accumulate up to the horizon. The 10th Percentile Loss from the Mean describes the loss level that is expected to be exceeded no more than 10 percent of the time, and so forth for the losses from the mean at the 5th, 1st, and 0.1th percentile levels.

Because the analysis horizon is set to 4 years (the remaining maturity of the BISTRO), the sum of the Expected Default Loss and each Percentile Loss from Mean captures an increasing portion of the loss that can accumulate after 4 years under worst case scenarios. Investors are indeed interested in examining the accumulated loss after 4 years, because contingent payments by investors to the BISTRO counterparty are due only at maturity, regardless of the timing of default losses or other credit events affecting underlying reference credits in the portfolio.

The insight to be gained from examining the sum of the Expected Default Loss and the Percentile Losses from Mean statistics comes from the analysis presented in Table 3.

Table 3. Analysis of BISTRO 1997-1000.

BISTRO Reference Portfolio Face Amount (USD):               $9.72B
Expected Default Loss (EDL) at BISTRO maturity:             $24M
First-Loss Equity Reserve Account (0.33% of face amount):   $32M

Percentile Loss                        10      5       1       0.1
Loss from Mean                         $42M    $92M    $267M   $648M
EDL plus Loss from Mean less Equity    $34M    $84M    $259M   $640M


Investors are accountable for portfolio losses incurred less the First-Loss Equity Reserve Account. At the 10th percentile level from the mean, it is likely that a portfolio loss of $42M will not be exceeded. Summing this amount with the Expected Default Loss and reducing by the First-Loss Equity Reserve (42 + 24 - 32 = 34) gives $34M; there is a 90% chance that a loss of such magnitude will not occur. First-Loss Equity Reserve aside, investors in the Subordinated Tranche must bear contingent payments up to $237M, with subsequent additional payments up to $460M coming from Senior Tranche investors. Thus, given current market conditions, there is a 90% likelihood that about 85% (1 - 34/237) of the Subordinated Notes Principal will be returned to investors, along with all of the Senior Notes Principal.

Similarly, considering the results at the 5th, 1st, and 0.1th percentile levels from the mean: only 5% of the time is more than 35% (84/237) of the Subordinated Principal expected to be lost. There is only a 1% chance that the entire Subordinated Principal will be wiped out (a loss of $259M at the 99th percentile); thus, there is almost a 99% chance that the Senior Principal will remain intact through to maturity. For comparison, the historical four-year default rate for Ba-rated bonds is about 9%.¹ In a 1-in-1000 worst case, i.e., at the 0.1th percentile level, 88% of the Senior Principal ((640 - 237)/460) will suffer along with all of the Subordinated Principal. This is roughly consistent with the historical four-year default experience for Aaa-rated bonds.
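The arithmetic behind these statements is simple enough to script. The sketch below (our own illustration, using only the figures from Table 3) nets each percentile loss against the equity reserve and allocates it up the capital structure:

```python
# Reproducing the Table 3 arithmetic: net worst-case loss at each percentile
# (EDL + loss from mean - equity reserve), allocated by seniority to the
# subordinated tranche first, then to the senior tranche.
EDL, EQUITY = 24.0, 32.0                 # US$M, from Table 3
SUB, SENIOR = 237.0, 460.0               # tranche notionals, US$M
loss_from_mean = {10: 42.0, 5: 92.0, 1: 267.0, 0.1: 648.0}

for pct, loss in loss_from_mean.items():
    net = max(EDL + loss - EQUITY, 0.0)  # e.g. 42 + 24 - 32 = 34 at the 10th
    sub_hit = min(net, SUB)
    sen_hit = min(max(net - SUB, 0.0), SENIOR)
    print(f"{pct} pctile: net ${net:.0f}M, subordinated loss "
          f"{sub_hit / SUB:.0%}, senior loss {sen_hit / SENIOR:.0%}")
```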

The upshot from the results of this Worst Loss Analysis of the BISTRO reference portfolio is that the tranches have a great likelihood of enduring until maturity, and thus investors will receive their full coupon (or yield to maturity). Investors in the BISTRO can perform this type of worst loss analysis periodically using updated credit ratings for the underlying names and market data from the CreditMetrics web site. Further research on the BISTRO can, perhaps, attempt to relate the coupons earned by the tranches to the losses incurred under various scenarios.

¹ See for example Carty, Lea and Dana Lieberman, "Corporate Bond Defaults and Default Rates 1938-1995," Moody's Investors Service, 1996.

The Valuation of Basket Credit Derivatives

CreditMetrics® Monitor, April 1999

David X. Li, RiskMetrics Group ([email protected])

The rapidly growing credit derivative market has created a new set of financial instruments which can be used to manage the most important dimension of financial risk - credit risk. In addition to the standard credit derivative products, such as credit default swaps and total return swaps based upon a single underlying credit risk, many new products are now associated with a portfolio of credit risks. A typical example is the product with payment contingent upon the time and identity of the first or second-to-default in a given credit risk portfolio. Variations include instruments with payment contingent upon the cumulative loss before a given time in the future. The equity tranche of a CBO/CLO is yet another variation, where the holder of the equity tranche incurs the first loss. Deductibles and stop-loss features in insurance products could also be incorporated into the basket credit derivatives structure. As the CBO/CLO market continues to expand, the demand for basket credit derivative products will most likely continue to grow. For simplicity's sake, we refer to all of these products as basket credit derivatives.

In the last issue of the CreditMetrics Monitor, Finger (1998) studies how to incorporate the building block of credit derivatives, the credit default swap, into the CreditMetrics model. This article further explores the possibility of incorporating more complicated basket credit derivatives into the CreditMetrics framework.

The actual practice of basket product valuation presents a challenge. To arrive at an accurate valuation, we must study the credit portfolio thoroughly. CreditMetrics provides a framework to study the credit portfolio over one period. In this paper, we extend the framework to cover multi-period problems using hazard rate functions. The hazard rate function based framework allows us to model the default time accurately - a critical component in the valuation of some basket products, such as first or second-to-default credit derivatives.

To properly evaluate a credit portfolio, it is critical to incorporate default correlation. Surprisingly, the existing finance literature fails to adequately define or discuss default correlation, defining it only as being based on discrete events, which dichotomize according to survival or non-survival at a critical period (one year, for example). We extend the framework by defining the correlation as the correlation between two survival times.

Additionally, we will explicitly introduce the concept of the copula function which, in the current CreditMetrics framework, has only been used implicitly. We will give some basic properties and common copula functions. Using a copula function and the marginal distribution of survival time of each credit in a credit portfolio, we will build a multivariate distribution of the survival times. Using this distribution, we can value any basket credit derivative structure.

The remainder of this article is organized as follows: First, we present a description of some typical basket credit derivative instruments. Second, we explore the characterization of default using hazard rate functions. Third, we correlate credit risks using copula functions. Finally, we present numerical examples of basket valuation.

Instrument Description

There are a variety of basket type credit derivative products, which we classify into the following three broad categories.


Type I. Payment associated with the total loss over a fixed period of time

This product has a contingent payment based upon the total loss of a given credit portfolio over a fixed time horizon, such as one year. An investor who worries about her investment in an equity tranche of a CBO or CLO can buy a contract which pays the loss over a certain amount, or deductible. Suppose $L_t$ is the cumulative total loss of a given credit portfolio at a prefixed time $t$ in the future, such as 1 year, and $D$ is the deductible. The payoff function for this instrument at time $t$ is

$$P = \begin{cases} 0, & \text{if } L_t \le D, \\ L_t - D, & \text{if } L_t > D. \end{cases} \tag{1}$$

We could also have a stop-loss type payoff function, in which the total loss up to a given amount is paid by the contract purchaser. In either this or the basic case, the payment is made at the end of the period and depends only upon the cumulative loss at period end. Thus, this type of product is similar to a European option.

Type II. Payment is associated with the cumulative loss across time

For this type of contract the payment is associated with the evolution of the loss function across time. For example, the contract could be structured in such a way that the payment starts if the cumulative loss of a given credit portfolio becomes larger than a lower limit $L$, and the payment continues after this point whenever a new loss occurs, until the cumulative loss reaches an upper limit $H$. Suppose we have a portfolio which has losses of 10, 13, 8, 0, 19 (in millions) in its first 5 years, and a lower limit of $15 million and an upper limit of $45 million. The payments for loss protection would then be 0, 8, 8, 0, 14.

Table 1. An Example of Loss and Payment of a Type II Product.

Year              1    2    3    4    5
Loss             10   13    8    0   19
Cumulative loss  10   23   31   31   50
Payment           0    8    8    0   14

The loss and payment of this product are shown in Table 1, which illustrates the necessity of tracking the loss function $L_t$ at different times in the future for this product type.
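A short sketch of this payment rule (our own illustration, not code from the article) reproduces the payment stream of Table 1:

```python
# Type II payment rule: payments cover new losses in the layer between the
# lower limit and the upper limit of cumulative loss.
def type_ii_payments(losses, lower, upper):
    payments, cum_prev = [], 0.0
    for loss in losses:
        cum = cum_prev + loss
        # part of [cum_prev, cum] that falls inside the layer [lower, upper]
        covered = min(cum, upper) - max(cum_prev, lower)
        payments.append(max(covered, 0.0))
        cum_prev = cum
    return payments

print(type_ii_payments([10, 13, 8, 0, 19], lower=15, upper=45))
# -> [0.0, 8.0, 8.0, 0.0, 14.0], matching Table 1
```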

Type III. Payment is associated with the time and identity of the default.

The payment of this product depends upon the time and identity of the first few defaults of a given credit portfolio. For example, the first-to-default contract pays an amount either prefixed or associated with the identity of the first defaulted asset in a given credit portfolio. A portfolio manager chooses a portfolio of three credit risks with face values of $100, $150 and $200 million respectively, and buys credit protection against the first-to-default of the portfolio. If one of the credits defaults, she



receives the face amount of the defaulted asset, delivers the defaulted asset to the first-to-default protection seller, and the credit default swap terminates.

The cash flows for a first-to-default transaction are represented in Chart 1. If we suppose a portfolio manager - a North American insurance company, for example - thinks the three underlying assets are uncorrelated and believes that it is highly unlikely that more than one will default, the company can buy first-to-default protection from a European investor who seeks diversified credit exposure in North America.¹

Chart 1. Cashflows for a first-to-default structure. (Diagram: the portfolio manager, as protection buyer, pays the credit protection seller x basis points per annum on the basket of reference assets AA, BB, CC; following the first default, the seller pays par less the recovery amount.)

A Hazard Rate Function Approach to Default Studies

CreditMetrics provides a one-period framework in which we can obtain the profile of a credit portfolio at the end of one period, such as one year. This framework is sufficient to value the Type I basket product and any derivative instrument with payment contingent upon the credit portfolio value at period end.

However, for Type II and III basket products, we need to model either the cumulative loss function across time or the default times of a given credit portfolio. To this end, we introduce some standard techniques of survival analysis in statistics to extend our one-period framework to a multi-period one. Survival analysis has been applied to other areas, such as life contingencies in actuarial science and industry life testing in reliability studies, which share a lot of similarities with the credit problems we encounter here. We introduce a random variable to denote each credit's survival time (the time from today until the time of default²), which plays an important role in the valuation of any credit structure. The characterization of this variable is usually by a hazard rate function. The next section provides some basic hazard rate function concepts.

¹ It is generally best to use credit risks with similar credit ratings and notional amounts. Otherwise, a very weak credit risk or a credit with a large notional amount would dominate the basket pricing.

² The survival time is infinity if default never occurs.



The Characterization of Default

In the study of default, interest centers on a group of individual companies, for each of which there is defined a point event, often called default (or survival), occurring after a length of time. We introduce a random variable $T_A$, called the time-until-default, or simply survival time, for a security $A$, to denote this length of time. This random variable is the basic building block for the valuation of cash flows subject to default. To precisely determine time-until-default we need:

• an unambiguously defined time origin;

• a time scale for measuring the passage of time;

• a clear definition of default.

We choose the current time as the time origin to allow the use of current market information to build credit curves. The time scale is defined in terms of years for continuous models, or number of periods for discrete models. For defaults, we must select from the various definitions provided by the rating agencies and the International Swap Dealers Association.

Survival Function

Consider a credit $A$ whose survival time is $T_A$. For notational simplicity we omit the use of the subscript $A$. The probability distribution of the survival time $T$ can be specified by the distribution function

$$F(t) = \Pr\{T \le t\}, \tag{2}$$

which gives the probability that default occurs before $t$. The corresponding probability density function is $f(t) = dF(t)/dt$. Either distribution specification can be used, whichever proves more convenient. In studying survival data it is useful to define the survival function

$$S(t) = 1 - F(t) = \Pr\{T > t\}, \tag{3}$$

giving the upper tail area of the distribution, that is, the probability that the credit survives to time $t$. Usually $F(0) = 0$, which implies $S(0) = 1$, since the survival time is a positive random variable. The distribution function $F(t)$ and the survival function $S(t)$ provide two mathematically equivalent ways of specifying the distribution of the survival time. There are, of course, other alternative ways of characterizing the distribution of $T$. The hazard rate function, for example, gives the credit's default probability over the time interval $[x, x + \Delta t]$ if it has survived to time $x$:

$$\Pr\{x < T \le x + \Delta t \mid T > x\} = \frac{F(x + \Delta t) - F(x)}{1 - F(x)} \approx \frac{f(x)\,\Delta t}{1 - F(x)}. \tag{4}$$

The hazard rate function

$$h(x) = \frac{f(x)}{1 - F(x)} \tag{5}$$



has a conditional probability density interpretation as the probability density function of default at exact age $x$, given survival to that time. The relationship between the hazard rate function, the distribution function, and the survival function is

$$h(x) = \frac{f(x)}{1 - F(x)} = -\frac{S'(x)}{S(x)}. \tag{6}$$

The survival function can then be expressed in terms of the hazard rate function using

$$S(t) = \exp\!\left(-\int_0^t h(s)\,ds\right). \tag{7}$$

Using the hazard rate function, we can calculate different default and survival probabilities. For example, we can calculate the conditional probability that a credit survives to year $x$ and then defaults during the following $t$ years as follows:

$${}_tq_x = 1 - \exp\!\left(-\int_0^t h(x + s)\,ds\right). \tag{8}$$

If $t = 1$, the series ${}_tq_x$ for $x = 0, 1, \ldots, n$ gives the conditional default probability of a credit in the next year if the credit survives to the year $x$.³ The probability density function of the survival time of a credit can also be expressed using the hazard rate function as follows:

$$f(t) = h(t) \cdot \exp\!\left(-\int_0^t h(s)\,ds\right). \tag{9}$$

A typical assumption is that the hazard rate is a constant, $h$, over a certain period, such as $[x, x + 1]$. In this case, the density function is

$$f(t) = h e^{-ht}, \tag{10}$$

which shows that the survival time follows an exponential distribution with parameter $h$. Under this assumption, the survival probability over the time interval $[x, x + t]$, for $0 < t \le 1$, is

$${}_tp_x = \exp\!\left(-\int_0^t h(s)\,ds\right) = e^{-ht} = (p_x)^t, \tag{11}$$

where $p_x$ is the probability of survival over a one-year period. This assumption can be used to scale down the default probability over one year to a default probability over a time interval of less than one year.

The aforementioned results can be found in survival analysis books, such as Bowers (1986).

³ Following the actuarial convention, we simply omit the prefix $t$ when $t = 1$.



Building Survival Functions in Practice

In the last section we introduced a new random variable to denote the survival time of each credit, and presented various equivalent ways to describe it. As in traditional survival analysis, we use the hazard rate function to describe a credit's default properties. If we know the hazard rate function, we can calculate any default probability, survival probability, or expected default time. In this section we discuss how to obtain the hazard rate function in practice.

There exist three methods to obtain the term structure of default rates:

• Obtaining historical default information from rating agencies;

• Taking the Merton option theoretical approach;

• Taking the implied approach using market price of defaultable bonds or asset swap spreads.

Rating agencies like Moody's publish historical default rate studies regularly. In addition to the commonly cited one-year default rates, they also present multi-year default rates, from which we can obtain the hazard rate function. For example, Moody's publishes weighted average cumulative default rates from 1 to 20 years. For the B rating, the first 5 years' cumulative default rates, in percentage terms, are 7.27, 13.87, 19.94, 25.03 and 29.45.4 From these rates we obtain the marginal conditional default probabilities: the first marginal conditional default probability in year one is simply the one-year default probability, 7.27%. The other marginal conditional default probabilities can be obtained using the following formula:

[12]  $\,_{n+1}q_x = \,_n q_x + \,_n p_x \cdot q_{x+n}$,

which simply states that the probability of default over the time interval $[0, n+1]$ is the probability of default over the time interval $[0, n]$, plus the probability of surviving to the end of the $n$th year and then defaulting during the following year. Rearranging Eq. [12] gives the marginal conditional default probability:

[13]  $q_{x+n} = \dfrac{\,_{n+1}q_x - \,_n q_x}{1 - \,_n q_x}$,

which results in marginal conditional default probabilities in years 2, 3, 4 and 5 of 7.12%, 7.05%, 6.36% and 5.90%. If we assume a piecewise constant hazard rate function over each year, we can then obtain the hazard rate function using Eq. [8]. The hazard rate function is given in Chart 2.

4 These numbers are taken from Carty and Lieberman (1997).
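The bootstrapping described here is mechanical; the following Python sketch (an illustrative addition, using the Carty and Lieberman numbers quoted above) reproduces the marginal conditional default probabilities via Eq. [13], and backs out a piecewise constant hazard rate for each year from $q = 1 - e^{-h}$.

```python
import math

# Cumulative default rates for the B rating, years 1-5 (Carty and Lieberman (1997)).
cumulative = [0.0727, 0.1387, 0.1994, 0.2503, 0.2945]

marginal, hazard = [], []
prev = 0.0
for cum in cumulative:
    # Eq. [13]: marginal conditional default probability for the year.
    q = (cum - prev) / (1.0 - prev)
    marginal.append(q)
    # Piecewise constant hazard over the year: q = 1 - exp(-h)  =>  h = -ln(1 - q).
    hazard.append(-math.log(1.0 - q))
    prev = cum

for year, (q, h) in enumerate(zip(marginal, hazard), start=1):
    print(f"year {year}: marginal default = {100*q:.2f}%, hazard = {100*h:.2f}%")
# Years 2-5 come out to 7.12%, 7.05%, 6.36%, 5.90%, matching the text.
```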


Chart 2
Hazard rate function for the B rating (hazard rate between 6% and 8%, over years 0 to 5).

This hazard rate function is a decreasing function of time, which implies that the further into the future we look, the lower the marginal default probability. Intuitively, a B rated credit tends to be upgraded rather than downgraded, as grade B is the lowest grade in Carty and Lieberman's (1997) cumulative default rate table.

Merton's option-based structural models can also produce a term structure of default rates, from which we can likewise derive the hazard rate function.

Alternatively, we can take the implicit approach, using market observable information such as asset swap spreads or risky corporate bond prices. This is the approach used by most credit derivative trading desks. The extracted default probabilities reflect the market's perception today of the future default tendencies of the underlying credit. For details about this approach, we refer to Li (1998).5

Correlating Survival Times Using Copula Functions

Let us study some problems of an $n$ credit portfolio. Using either the historical approach or the implicit approach, we can construct the marginal distribution of survival time for each of the $n$ credit risks in the portfolio. If we assume mutual independence among the credit risks, we can study any problem associated with the portfolio. However, the independence assumption is obviously not realistic; in reality, a group of credits' default rates tend to be higher in a recession and lower when the economy is booming. This implies that each credit is subject to the same macroeconomic environment, and that there exists some form of positive dependence among the

5 The hazard rate function is usually called a credit curve due to the analogy between the yield curve and the hazard rate function.


credits. In order to introduce a correlation structure into the portfolio, we must determine how to specify a joint distribution of survival times with given marginal distributions.

Obviously, this problem has no unique solution. Generally speaking, knowing the joint distribution of random variables allows us to derive the marginal distributions and the correlation structure among the random variables, but not vice versa. There are many different techniques in statistics which allow us to specify a joint distribution function with given marginal distributions and a correlation structure. Among them, the copula function is a simple and convenient approach. We give a brief introduction to the concept of a copula function in the next section.

Definition and Basic Properties of a Copula Function

A copula function (the function $C$ below) links, or "marries," univariate marginals to their full multivariate distribution. For $m$ uniform random variables $U_1, U_2, \dots, U_m$, the joint distribution function is expressed as6

[14]  $C(u_1, u_2, \dots, u_m) = \Pr\{U_1 \le u_1, U_2 \le u_2, \dots, U_m \le u_m\}$.

We see that a copula function is just a cumulative multivariate uniform distribution function. For given univariate marginal distribution functions $F_1(x_1), F_2(x_2), \dots, F_m(x_m)$, the function

[15]  $F(x_1, x_2, \dots, x_m) = C(F_1(x_1), F_2(x_2), \dots, F_m(x_m))$,

which is defined using a copula function $C$, results in a multivariate distribution function with the specified univariate marginal distributions.

This property can be shown as follows:

[16]  $C(F_1(x_1), F_2(x_2), \dots, F_m(x_m)) = \Pr\{U_1 \le F_1(x_1), U_2 \le F_2(x_2), \dots, U_m \le F_m(x_m)\}$
      $= \Pr\{F_1^{-1}(U_1) \le x_1, F_2^{-1}(U_2) \le x_2, \dots, F_m^{-1}(U_m) \le x_m\}$
      $= \Pr\{X_1 \le x_1, X_2 \le x_2, \dots, X_m \le x_m\}$.

The marginal distribution of $X_i$ is

[17]  $C(F_1(\infty), \dots, F_i(x_i), \dots, F_m(\infty)) = \Pr\{X_1 < \infty, \dots, X_i \le x_i, \dots, X_m < \infty\} = \Pr\{X_i \le x_i\}$.

Sklar (1959) established the converse. He showed that any multivariate distribution function $F$ can be written in the form of a copula function. He proved the following: if $F(x_1, x_2, \dots, x_m)$ is a joint multivariate distribution function with univariate marginal distribution functions $F_1(x_1), F_2(x_2), \dots, F_m(x_m)$, then there exists a copula function $C(u_1, u_2, \dots, u_m)$ such that

6 The function $C$ also contains correlation information, which we do not express explicitly here for simplicity's sake.


[18]  $F(x_1, x_2, \dots, x_m) = C(F_1(x_1), F_2(x_2), \dots, F_m(x_m))$.

If each $F_i$ is continuous, then $C$ is unique.7 Thus, copula functions provide a unifying and flexible way to study multivariate distributions.

For simplicity's sake, we discuss only the properties of the bivariate copula function $C(u, v, \rho)$ for uniform random variables $U$ and $V$ defined over the area $\{(u, v) : 0 < u \le 1,\ 0 < v \le 1\}$, where $\rho$ is a correlation parameter. We call $\rho$ simply a correlation parameter since it does not necessarily equal the usual correlation coefficient defined by Pearson, nor Spearman's Rho, nor Kendall's Tau.8 The bivariate copula function has the following properties:

• Since $U$ and $V$ are positive random variables, $C(0, v, \rho) = C(u, 0, \rho) = 0$.

• Since $U$ and $V$ are bounded above by 1, the marginal distributions are obtained as $C(1, v, \rho) = v$ and $C(u, 1, \rho) = u$.

• For independent random variables $U$ and $V$, $C(u, v, \rho) = uv$.

Frechet (1951) showed that there exist upper and lower bounds for a copula function:

[19]  $\max\{0,\ u + v - 1\} \le C(u, v) \le \min\{u, v\}$.

The multivariate extension of Frechet bounds is given by Dall'Aglio (1972).

Some Common Copula Functions

We present a few copula functions commonly used in biostatistics and actuarial science.

The Frank copula function is defined as:

[20]  $C(u, v) = \dfrac{1}{\alpha} \ln\left[1 + \dfrac{(e^{\alpha u} - 1)(e^{\alpha v} - 1)}{e^{\alpha} - 1}\right]$, $\quad -\infty < \alpha < \infty$.

The bivariate normal copula function is:

[21]  $C(u, v) = \Phi_2(\Phi^{-1}(u), \Phi^{-1}(v), \rho)$,

where $\Phi_2$ is the bivariate normal distribution function with correlation coefficient $\rho$, and $\Phi^{-1}$ is the inverse of the univariate normal distribution function. As we shall see later, this is the copula function used in CreditMetrics.

7 For the proof of this theorem we refer to Sklar (1959).

8 We define Spearman’s Rho and Kendall’s Tau later.
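As an illustrative aside, the sketch below evaluates the bivariate normal copula of Eq. [21] with scipy's bivariate normal CDF and checks the value by simulation. The parameter values are arbitrary examples, and the code is our addition.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def normal_copula(u, v, rho):
    """Bivariate normal copula, Eq. [21]: C(u,v) = Phi2(Phi^-1(u), Phi^-1(v); rho)."""
    biv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])
    return biv.cdf([norm.ppf(u), norm.ppf(v)])

rho, u, v = 0.5, 0.3, 0.7
print("C(u,v) =", normal_copula(u, v, rho))

# Monte Carlo check: map correlated normals to uniforms and count the joint event.
rng = np.random.default_rng(0)
x = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=200_000)
uu, vv = norm.cdf(x[:, 0]), norm.cdf(x[:, 1])
print("MC estimate =", np.mean((uu <= u) & (vv <= v)))
```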


The bivariate mixture copula function is formed using existing copula functions. If the two uniform random variables $u$ and $v$ are independent, we have the copula function $C(u, v) = uv$. If the two random variables are perfectly correlated, we have the copula function $C(u, v) = \min(u, v)$. Mixing the two copula functions by a mixing coefficient $\rho > 0$, we obtain a new copula function as follows:

[22]  $C(u, v) = (1 - \rho)\,uv + \rho \min\{u, v\}$, if $\rho > 0$.

If $\rho \le 0$ we have

[23]  $C(u, v) = (1 + \rho)\,uv - \rho\,(u - 1 + v)\,\Delta(u - 1 + v)$, $\quad \rho \le 0$,

where

[24]  $\Delta(x) = \begin{cases} 1, & \text{if } x > 0, \\ 0, & \text{if } x \le 0, \end{cases}$

is an indicator function.

Copula Function and Correlation Measurement

To compare different copula functions, we need a correlation measurement that is independent of the marginal distributions. The usual Pearson correlation coefficient, however, depends on the marginal distributions (see Lehmann (1966)). Both Spearman's Rho and Kendall's Tau can be defined using a copula function alone, as follows:

[25]  $\rho_s = 12 \displaystyle\iint [C(u, v) - uv]\, du\, dv$,

[26]  $\tau = 4 \displaystyle\iint C(u, v)\, dC(u, v) - 1$.

Comparisons between results using different copula functions should therefore be based on either a common Spearman's Rho or a common Kendall's Tau.

Further examination of copula functions can be found in the survey paper by Frees and Valdez (1998).

The Calibration of Default Correlation of Survival Times

Having chosen a copula function, we need the pairwise correlations of all survival times. Using the CreditMetrics asset correlation approach, we can obtain the default correlation of two discrete default events over a one year period. As it happens, CreditMetrics uses the normal copula function in its default correlation formula, even though it does not use the concept of copulas explicitly.

First, let us summarize how CreditMetrics calculates the joint default probability of two credits, A and B. Suppose the one year default probabilities for A and B are $q_A$ and $q_B$. CreditMetrics uses the following steps:


• Obtain $Z_A$ and $Z_B$ such that $q_A = \Pr\{Z < Z_A\}$ and $q_B = \Pr\{Z < Z_B\}$, where $Z$ is a standard normal random variable.

• If $\rho$ is the asset correlation, the joint default probability for credits A and B is calculated as follows:

[27]  $\Pr\{Z_1 < Z_A,\ Z_2 < Z_B\} = \displaystyle\int_{-\infty}^{Z_A} \int_{-\infty}^{Z_B} \phi_2(x, y, \rho)\, dy\, dx = \Phi_2(Z_A, Z_B, \rho)$,

where $\phi_2(x, y, \rho)$ is the standard bivariate normal density function with correlation coefficient $\rho$, and $\Phi_2$ is the bivariate cumulative normal distribution function.

If we use a bivariate normal copula function with correlation parameter $\kappa$, and denote the survival times of A and B by $T_A$ and $T_B$, the joint default probability of A and B, which is the probability that $T_A < 1$ and $T_B < 1$, can be calculated using

[28]  $\Pr\{T_A < 1,\ T_B < 1\} = \Phi_2(\Phi^{-1}(F_A(1)), \Phi^{-1}(F_B(1)), \kappa)$,

where $F_A$ and $F_B$ are the distribution functions of $T_A$ and $T_B$. Noting that

[29]  $q_j = \Pr\{T_j < 1\} = F_j(1)$ and $Z_j = \Phi^{-1}(q_j)$ for $j = A, B$,

we see that Eq. [27] and Eq. [28] give the same joint default probability over a one year period if $\kappa = \rho$.

We can conclude that CreditMetrics uses a bivariate normal copula function with the asset correlation as the correlation parameter of the copula function. Thus, to generate survival times for two credit risks, we use a bivariate normal copula function with correlation parameter equal to the CreditMetrics asset correlation. We note that this correlation parameter is not the correlation coefficient between the two survival times $T_A$ and $T_B$; that correlation coefficient is much smaller than the asset correlation. Conveniently, the marginal distribution of any subset of an $n$ dimensional normal distribution is still normal. Using asset correlations, we can construct high dimensional normal copula functions to model a credit portfolio.

Valuation Method

Suppose now that we are to study an $n$ credit portfolio problem. For each credit $i$ in the portfolio, we have constructed a credit curve, or hazard rate function $h_i$, for its survival time $T_i$, based either on historical default experience or on the implicit approach. We assume that the distribution function of $T_i$ is $F_i(t)$. Using a copula function $C$, we obtain the joint distribution of the survival times $T_1, T_2, \dots, T_n$ as follows:

[30]  $F(t_1, t_2, \dots, t_n) = C(F_1(t_1), F_2(t_2), \dots, F_n(t_n))$.

If we use the normal copula function we have:


[31]  $F(t_1, t_2, \dots, t_n) = \Phi_n(\Phi^{-1}(F_1(t_1)), \Phi^{-1}(F_2(t_2)), \dots, \Phi^{-1}(F_n(t_n)))$,

where $\Phi_n$ is the $n$ dimensional cumulative normal distribution function with correlation matrix $\Gamma$.

To simulate correlated survival times, we introduce another series of random variables $Y_1, Y_2, \dots, Y_n$ such that

[32]  $Y_1 = \Phi^{-1}(F_1(T_1)),\ Y_2 = \Phi^{-1}(F_2(T_2)),\ \dots,\ Y_n = \Phi^{-1}(F_n(T_n))$.

There is a one-to-one mapping between $T_1, T_2, \dots, T_n$ and $Y_1, Y_2, \dots, Y_n$, so simulating the $T$s is equivalent to simulating the $Y$s. As shown in the last section, the correlation between the $Y$s is the asset correlation of the underlying credits. We have the following simulation scheme (a sketch of which appears below):

• Simulate $Y_1, Y_2, \dots, Y_n$ from a multivariate normal distribution with correlation matrix $\Gamma$.

• Obtain $T_1, T_2, \dots, T_n$ using $T_i = F_i^{-1}(\Phi(Y_i))$, $i = 1, 2, \dots, n$.

Given full information on the default times and identities, we can track any loss function, such as the cumulative loss over a certain period. We can also price first- or second-to-default structures, since the default times can easily be sorted and ranked.
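The following Python sketch (an illustrative addition) implements the two-step scheme for the special case of constant hazard rates, where $F_i^{-1}(u) = -\ln(1-u)/h$ is available in closed form. The portfolio size, hazard rate, correlation, and interest rate below are example values.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

n, h, rho, trials = 5, 0.1, 0.25, 50_000                 # illustrative parameters
gamma = np.full((n, n), rho) + (1.0 - rho) * np.eye(n)   # constant correlation matrix

# Step 1: simulate Y ~ N(0, Gamma), one row per scenario.
y = rng.multivariate_normal(np.zeros(n), gamma, size=trials)

# Step 2: map to survival times via T_i = F_i^{-1}(Phi(Y_i)).
# With a constant hazard h, F(t) = 1 - exp(-h t), so F^{-1}(u) = -ln(1-u)/h.
u = norm.cdf(y)
t = -np.log1p(-u) / h

# With the full vector of default times we can track any order statistic,
# for example the first-to-default time in each scenario:
first = t.min(axis=1)
print("P(first default within 2 years):", np.mean(first < 2.0))

# Value of a contract paying $1 at the first default, if within 2 years, at r = 10%:
value = np.mean(np.where(first < 2.0, np.exp(-0.1 * first), 0.0))
print("contract value:", value)
```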

In the special case of independence among $T_1, T_2, \dots, T_n$, the first-to-default can be valued analytically. Let us denote the survival time of the first-to-default of the $n$ credits by $T$, i.e.,

[33]  $T = \min\{T_1, T_2, \dots, T_n\}$.

Under the independence assumption, the hazard rate function of $T$ is

[34]  $h_T = h_1 + h_2 + \dots + h_n$.

Assume a contract which pays one dollar when the first default of the $n$ credits occurs within 2 years, and assume the yield is a constant $r$, so that the present value of a dollar paid at time $T$ is $Z = 1 \cdot e^{-rT}$. The survival time of the first-to-default has density function $f(t) = h_T\, e^{-h_T t}$, so the value of the contract can be calculated as

[35]  $V = \displaystyle\int_0^2 e^{-rt} f(t)\, dt = \int_0^2 e^{-rt}\, h_T\, e^{-h_T t}\, dt = \dfrac{h_T}{r + h_T}\left(1 - \exp[-2(r + h_T)]\right)$.

However, Eq. [34] will not necessarily hold if $T_1, T_2, \dots, T_n$ are not independent. For example, consider the bivariate exponential distribution, the joint survival function of which is given by

[36]  $S(t_1, t_2) = \Pr\{T_1 > t_1,\ T_2 > t_2\} = \exp[-\lambda_1 t_1 - \lambda_2 t_2 - \lambda_{12} \max\{t_1, t_2\}]$,

where $\lambda_1, \lambda_2, \lambda_{12} > 0$ are three parameters. The marginal survival functions of $T_1$ and $T_2$ are


[37]  $S(t_1) = \exp[-(\lambda_1 + \lambda_{12})\, t_1]$ and $S(t_2) = \exp[-(\lambda_2 + \lambda_{12})\, t_2]$.

It is easy to show that the covariance between $T_1$ and $T_2$ is given by

[38]  $\mathrm{Cov}[T_1, T_2] = \dfrac{\lambda_{12}}{(\lambda_1 + \lambda_2 + \lambda_{12})(\lambda_1 + \lambda_{12})(\lambda_2 + \lambda_{12})}$,

which implies that $T_1$ and $T_2$ are not independent if $\lambda_{12} \ne 0$. It can also be shown that the hazard rate of the minimum of $T_1$ and $T_2$ is $\lambda_1 + \lambda_2 + \lambda_{12}$, instead of $\lambda_1 + \lambda_2$ as in the case of independence.

Hence, for a credit portfolio with a large number of correlated credit risks, we still resort to the Monte Carlo simulation scheme outlined in this section.

Numerical Examples

The first example illustrates how to value a Type (I) first loss credit derivative using the CreditManager application. The second shows how to value a first-to-default structure on a portfolio of five credits.

Example 1

CreditManager uses a simulation approach to obtain the distribution of a credit portfolio's value at the end of a time horizon, such as one year. Using this distribution, we can assess the possible values of the portfolio in the future. Using more detailed reports, we can also break credit risk down by country, industry, maturity, rating, product type, or any other category of credit exposure. Thus, managers can identify pockets or concentrations of risk within a single portfolio, or across an entire firm, and take appropriate action to actively manage the portfolio. As the credit derivative market develops, portfolio managers can use the new credit derivative instruments to accomplish these goals.

As an example, we run a simulation on the CreditManager sample portfolio. The simulation summary is given in Chart 3. We see that the sample portfolio is a rather "good" credit portfolio, in the sense that the distribution of the portfolio value is concentrated around the mean and has a relatively small standard deviation. Suppose, nonetheless, that the portfolio manager wants to buy a credit derivative to protect the portfolio from declining to a level below $C$. How much should be paid for the protection?

The payoff function of the structure is

[39]  $P = \begin{cases} 0, & \text{if } X > C, \\ C - X, & \text{if } X \le C, \end{cases}$

where $X$ is the portfolio value at the end of one year. The expected value of $P$ is


[40]  $E[P] = \displaystyle\int_0^C (C - x)\, f(x)\, dx$,

where $f(x)$ is the probability density function of the sample portfolio value at the end of one year.

Chart 3
Simulated distribution of the CreditManager sample portfolio.

To calculate the protection premium, we need only evaluate Eq. [40] and discount the result to the present. For simplicity's sake, we assume the discount factor is 1.0. CreditManager can export the density function of the simulated portfolio value into a spreadsheet, from which we can evaluate Eq. [40].

If the portfolio manager wants to protect the portfolio value from falling below the 10th percentile loss from the mean value of 128.41 million, the premium she needs to pay is approximately $62,547. The premiums to protect other percentile losses can be calculated similarly; the results are depicted in Chart 4.
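A minimal sketch of the Eq. [40] calculation follows, assuming the simulated density has been exported as discrete value/probability pairs. The numbers below are hypothetical placeholders, not the actual CreditManager export.

```python
import numpy as np

# Hypothetical export: discrete portfolio values (in $ millions) and probabilities.
values = np.array([110.0, 115.0, 120.0, 125.0, 128.41, 130.0])
probs  = np.array([0.001, 0.004, 0.020, 0.175, 0.500, 0.300])

def premium(C):
    """Eq. [40] with a unit discount factor: E[P] = sum over x <= C of (C - x) Pr{X = x}."""
    payoff = np.where(values <= C, C - values, 0.0)
    return np.sum(payoff * probs)

print("premium for C = 120:", premium(120.0), "million")
```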

Example 2

The second example shows how to value a first-to-default contract. We assume a portfolio of $n$ credits. For simplicity's sake, we assume each credit has a constant hazard rate $h$ for $0 < t < \infty$. From Eq. [10] we know the density function of the survival time is $f(t) = h \exp[-ht]$; that is, the survival time is exponentially distributed with mean $E[T] = 1/h$. We also assume that the credits have a constant pairwise asset correlation $\rho$.9

9 To have a positive definite correlation matrix, the constant correlation coefficient $\rho$ has to satisfy the condition $\rho > -1/(n-1)$.


Chart 4
Premium (in $000's) as a function of percentile level protected.

The contract is a two-year transaction which pays one dollar if the first default occurs during the first two years. We also assume a constant interest rate of 10%. If all the credits in the portfolio are independent, the hazard rate of the minimum survival time is $h_T = nh$ and the contract value is given by Eq. [35].

We choose the following basic set of parameters: $n = 5$, $h = 0.1$, $\rho = 0.25$, $r = 0.1$.

First, we examine the impact of the number of assets on the value of the first-to-default contract. If there is only one asset, the value of the contract should be 0.1648. As the number of assets increases, the chance of a default within the two years also grows, as does the value of the first-to-default contract. Chart 5 shows how the value of the first-to-default changes with the number of assets in the portfolio. We see that the value of the contract increases at a decreasing rate. When the number of assets increases to 15, the value of the contract becomes 0.7533. From Chart 5 we also see that the impact of the number of assets on the value of the first-to-default decreases as the default correlation increases.

Second, we examine the impact of correlation on the value of the first-to-default contract on 5 assets. If $\rho = 0$, the expected payoff, based on Eq. [35], should be 0.5823; our simulation of 50,000 runs gives a value of 0.5830. If all 5 assets are perfectly correlated, the first-to-default of 5 assets should be the same as the first-to-default of 1 asset, since any one default induces all the others; in this case the contract should be worth 0.1648, and our simulation of 50,000 runs produces 0.1638. Chart 6 shows the relationship between the value of the contract and the constant correlation coefficient: the value of the contract decreases as the correlation increases. We also examine the impact of correlation on the value of the first-to-default of 20 assets in Chart 6. As expected, the first-to-default of 5 assets has the same value as the first-to-default of 20 assets as the correlation approaches 1.
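Both benchmark values quoted above follow directly from Eq. [35]; the following check is simple arithmetic verification, added for the reader's convenience.

```python
import math

def first_to_default_value(h_T, r=0.1, horizon=2.0):
    """Eq. [35]: value of $1 paid at the first default, if it occurs before the horizon."""
    return h_T / (r + h_T) * (1.0 - math.exp(-horizon * (r + h_T)))

print(first_to_default_value(0.1))       # one asset, h_T = h = 0.1      -> 0.1648
print(first_to_default_value(5 * 0.1))   # five independent assets, nh   -> 0.5823
```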


Chart 5
Value of first-to-default as a function of number of assets. Correlation levels of 25% and 50%.

Chart 6
Value of first-to-default as a function of correlation level. 5 and 20 asset baskets.


Conclusion

We have shown how to value basket-type credit derivative contracts whose payoffs are contingent upon the default properties of a portfolio of credits. If the payoff function depends only on the value of the portfolio at a given time in the future, we can use CreditManager to price the transaction. For other products, with payoffs contingent upon either the cumulative loss across time or the order and identity of defaults, we introduce a hazard rate function based approach. This approach models the default time directly, which characterizes default more precisely than does the discrete approach over a given period. We also explicitly introduce the concept of a copula function, providing the basic concept, properties, and examples, and thus demonstrate how the CreditMetrics framework facilitates the valuation of credit derivative baskets.

References

Gupton, Greg M., Christopher C. Finger, and Mickey Bhatia. CreditMetrics -- Technical Document, New York: Morgan Guaranty Trust Co., 1997.

Bowers, Newton L., et al. Actuarial Mathematics, The Society of Actuaries, 1986.

Carty, Lea and Dana Lieberman. "Historical Default Rates of Corporate Bond Issuers, 1920-1996," Moody's Investors Service, January 1997.

Finger, Christopher. "Credit Derivatives in CreditMetrics," CreditMetrics Monitor, 3rd Quarter 1998.

Sklar, A. "Fonctions de répartition à n dimensions et leurs marges," Publications de l'Institut de Statistique de l'Université de Paris, 8:229-231, 1959.

Dall'Aglio, G. "Fréchet Classes and Compatibility of Distribution Functions," Symposia Mathematica, 9:131-150, 1972.

Frees, Edward W. and Emiliano A. Valdez. "Understanding Relationships Using Copulas," North American Actuarial Journal, 2(1):1-25, 1998.

Lehmann, E. L. "Some Concepts of Dependence," Annals of Mathematical Statistics, 37:1137-1153, 1966.

Li, David X. "Constructing a Credit Curve," Credit Risk, A Risk Special Report, 40-44, 1998.

Conditional Approaches for CreditMetrics Portfolio Distributions

Christopher C. Finger
RiskMetrics Group
[email protected]

It is well known that the CreditMetrics model relies on Monte Carlo simulation to calculate the full distribution of portfolio value. Taking this approach, independent scenarios are generated in which the future credit rating of each obligor in the portfolio is known, and correlations are reflected so that highly correlated obligors, for example, default in the same scenario more frequently than less correlated obligors. In each scenario, the credit ratings of the obligors determine the value of the portfolio; accumulating the value of the portfolio across scenarios allows us to estimate descriptive statistics for the portfolio, or even to examine the shape of the distribution itself.

While the CreditMetrics Monte Carlo approach is attractive for its flexibility, it suffers from relatively slow convergence. Any statistic obtained through Monte Carlo is subject to simulation noise, but this noise is slower to disappear in our model than in models such as RiskMetrics, where the distributions of individual assets are continuous. We shall see that by performing simulations as we do currently, we fail to take full advantage of the model's structure. Specifically, we will see that once we condition on the industry factors that drive the model, all defaults and rating changes are independent. Though prior studies (Koyluoglu and Hickman (1998), Finger (1998), and Gordy (1998)) have addressed this fact, they have focused more on using it to facilitate comparisons between CreditMetrics and other models. Intuitively, conditional independence is a useful feature, since there is a wealth of machinery upon which we may call to aggregate independent risks. In this article, we illustrate how to view the CreditMetrics model in a conditional setting, and use conditional independence to exploit a variety of established results. This provides us with a toolbox of techniques to improve on the existing Monte Carlo simulations.

We begin by illustrating the conditional approach with a simple example. Next, we apply three different techniques to either approximate or compute directly the conditional portfolio distribution, and show how these techniques may be used to enhance our Monte Carlo procedure. Finally, we relax the assumptions of the simple example, and show how these techniques may be applied in the general case.

The simplest case

To begin, let us assume the simplest possible credit portfolio: a portfolio of $N$ loans of equal size, each worth zero in the case of default and $1/N$ otherwise.1 Assume each loan has a probability $p$ of defaulting, and let $Z_i$ be the normalized asset value change for the $i$th obligor. The model is simply that the $i$th obligor defaults if $Z_i < \alpha$, where $\alpha = \Phi^{-1}(p)$ is the default threshold.2 Suppose that each normalized asset value change can be expressed by

[1]  $Z_i = w \cdot Z + \sqrt{1 - w^2} \cdot \varepsilon_i$,

where $Z$ is the (normalized) return on a common market index, $\varepsilon_i$ is the idiosyncratic movement for this obligor, and $w$ is the common weight of each obligor on the market index. We assume, as always, that $Z, \varepsilon_1, \varepsilon_2, \dots, \varepsilon_N$ are independent, normally distributed random variables with mean zero and

1 Thus, regardless of the number of loans $N$, the total size of the portfolio is one. The size of the loans is arbitrary, but this formulation will aid our exposition later.

2 Throughout, we will use $\Phi$ to denote the cumulative distribution function (CDF), and $\phi$ the density function, of the standard normal distribution.


variance one. Readers will recognize this as the constant correlation case, with all pairwise asset correlations set to $w^2$.

Under the standard approach, we observe that it is impractical to compute all of the joint default probabilities (for instance, the probability that two obligors default together, or that a given four obligors default while seven others do not) directly by integrating the multivariate distribution of asset values. For this reason, we obtain the portfolio distribution through Monte Carlo simulation. We generate a large number of independent draws of $Z$ and an equal number of independent draws of $\varepsilon_1, \varepsilon_2, \dots, \varepsilon_N$, and then produce the scenarios for the $Z_i$ through Eq. [1].3 Given the draws of $Z_i$, we may identify which obligors default in each scenario, tabulate the portfolio value in each scenario, and produce the portfolio distribution.
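For concreteness, here is a compact Python version of this standard procedure under the one factor assumptions above; the parameter values are illustrative only.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
N, p, w, trials = 100, 0.05, 0.5, 10_000    # illustrative parameters
alpha = norm.ppf(p)                          # default threshold

Z = rng.standard_normal(trials)              # market factor, one draw per scenario
eps = rng.standard_normal((trials, N))       # idiosyncratic terms
Zi = w * Z[:, None] + np.sqrt(1.0 - w**2) * eps   # Eq. [1]

defaults = Zi < alpha                        # obligor i defaults when Z_i < alpha
portfolio = 1.0 - defaults.mean(axis=1)      # each loan is worth 1/N unless it defaults

print("mean value:", portfolio.mean())
print("1st percentile:", np.percentile(portfolio, 1))
```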

While this is a natural approach, it suffers from a number of drawbacks. First, it is necessary to draw a large number of random numbers -- $1{,}000 \cdot (N+1)$ for 1,000 Monte Carlo trials. This is costly in terms of the number of operations and the memory required, and potentially taxing on a random number generator if the number of obligors is very large. Second, as mentioned before, convergence (that is, the rate at which estimated statistics approach their true values as a function of the number of scenarios used) is slow. One reason for this, intuitively, is that there is little information in each trial. That is, if $p$ were equal to 1% (and thus $\alpha$ equal to -2.33), one trial where some $Z_i$ is equal to -2 and another where $Z_i$ is equal to 0 are the same (both produce a "no default" for the $i$th obligor), and yet the first trial is in a sense "closer" to a default. We would like to exploit the idea that not all non-default trials are alike to make our simulations more efficient. Another aspect of the slow convergence is that relative risk rankings may be spurious. For instance, in our simple model clearly no loan is more risky than any other, yet it is certainly plausible that in a set of Monte Carlo trials, some obligors will be identified as defaulting more frequently than others, producing the misleading conclusion that one obligor, on a marginal basis, contributes more risk to the portfolio.

In order to improve our simulation approach, we first observe that once we have fixed the market factor $Z$, everything else that happens to the obligors is independent. More formally, the obligors are conditionally independent given $Z$. The conditional independence will prove crucial, as it transforms the complex problem of aggregating correlated exposures into the well understood problem of convolution, that is, the aggregation of independent exposures. Returning to Eq. [1], we see that if we fix $Z$, the condition that a given obligor defaults becomes

[2]  $\varepsilon_i < \dfrac{\alpha - w \cdot Z}{\sqrt{1 - w^2}}$.

Since $\varepsilon_i$ follows the standard normal distribution, the probability, given $Z$, that Eq. [2] occurs is given by

[3]  $p(Z) = \Phi\left(\dfrac{\alpha - w \cdot Z}{\sqrt{1 - w^2}}\right)$.

3 Equivalently, we could build the correlation matrix for the $Z_i$ and generate scenarios using the Cholesky decomposition of this matrix. We choose the other method here mostly for ease of exposition.


We refer to $p(Z)$ as the conditional default probability. Chart 1 illustrates the effect of different $Z$ values on the conditional asset distribution and on the conditional default probability. Observe that when $Z$ is negative (the market factor decreases, indicating a bad year), the obligors are closer to defaulting, and their conditional default probability is greater; when $Z$ is positive, the opposite is true.

Chart 1
Unconditional asset distribution, and conditional distributions with positive and negative $Z$.

The strength of the dependence on $Z$ is a function of the index weight $w$. When $w$ is close to one (meaning asset correlations are high), the conditional default probability is most affected by the market; when $w$ is lower, more of the obligor randomness is due to the idiosyncratic term, and the conditional default probability is less affected. At the extreme, when $w$ is zero, there is no dependence on the market, and the conditional default probability is always just $p$, regardless of the value of $Z$. See Chart 2.

Before moving on to our new portfolio approaches, we point out that the conditional framework provides us with a convenient way to decompose and compute the variance of the portfolio. In general, we may decompose the variance of any random variable as

[4]  Variance = (Variance of conditional mean) + (Mean of conditional variance).

For our case, the conditional mean of the portfolio value (recall that each loan is worth either zero or $1/N$) is $1 - p(Z)$, the variance of which is just the variance of $p(Z)$. Since the expectation of $p(Z)$ is $p$, this variance is $E[p(Z)^2] - p^2$. To compute the first term, we evaluate the integral4

is , the variance is . To compute the first term, we evaluate the integral4

4 See Vasicek (1997) for this and other details of the distribution of . In the article, this distribution is referred to as the normal inverse.

p Z( ) Z

ZZ

Z

-3 -2 -1 0 1 2 3Market factor

0.1

0.2

0.3

0.4

0.5

Relative frequency

Unconditional

Cond., Z=-2

Cond., Z=2

Z w w

ww

p Z

Variance Variance of conditional mean( ) Mean of conditional variance( )+=

V1 N⁄ 1 p Z( )–( ) p Z( )

p Z( ) p E p Z( )2[ ] p2–

p Z( )

Conditional Approaches for CreditMetrics Portfolio Distributions

Page 123: Selection of Research Material relating to RiskMetrics Group CDO Manager

page 17CreditMetrics® Monitor

April 1999

[5]  $E[p(Z)^2] = \displaystyle\int_{-\infty}^{\infty} \phi(z)\, p(z)^2\, dz = \int_{-\infty}^{\infty} \phi(z)\, \Phi\left(\dfrac{\alpha - wz}{\sqrt{1 - w^2}}\right)^2 dz = \Phi_2(\alpha, \alpha;\, w^2)$,

where $\Phi_2(\cdot, \cdot;\, w^2)$ is the bivariate normal CDF with correlation $w^2$. Thus, the first term in Eq. [4], the variance of the conditional portfolio mean, is equal to $\Phi_2(\alpha, \alpha;\, w^2) - p^2$. We may think of this as the portfolio variance that is due to moves in the market factor.

Chart 2
Conditional default probability as a function of the market factor, for index weights $w$ of 0%, 10%, 20%, and 30%.

To compute the second term in Eq. [4], we utilize the conditional independence of the individual obligors. In particular, given $Z$, defaults are independent and occur with probability $p(Z)$. Thus, the conditional variance of the value of an individual loan is $p(Z)(1 - p(Z))/N^2$.5 Since the loan values are conditionally independent, the conditional portfolio variance is the sum of the individual conditional variances, or $p(Z)(1 - p(Z))/N$. Taking expectations and using Eq. [5] again, we see that the mean of the portfolio conditional variance is $(p - \Phi_2(\alpha, \alpha;\, w^2))/N$. Putting everything together, we see that the portfolio variance is given by

[6]  $\underbrace{(\Phi_2(\alpha, \alpha;\, w^2) - p^2)}_{\text{variance based on market}} + \underbrace{(p - \Phi_2(\alpha, \alpha;\, w^2))/N}_{\text{idiosyncratic variance}}$.

5 The $1/N^2$ term is due to the size of the individual loans.


In Chart 3, we observe the percentage of portfolio variance that is due to the market, as a function of correlation level and portfolio size. As we would expect, for higher correlations and larger portfolios, more of the portfolio variance is explained by the market.

Chart 3
Percentage of portfolio variance due to market moves, for portfolio sizes N = 100, 200, 500, and 1000.
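The decomposition in Eq. [6] is straightforward to evaluate with a bivariate normal CDF; the sketch below (an illustrative addition, with assumed p, w, and N) computes the market share of portfolio variance plotted in Chart 3.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def variance_split(p, w, N):
    """Eq. [6]: (market variance, idiosyncratic variance) of the portfolio value."""
    alpha = norm.ppf(p)
    biv = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, w**2], [w**2, 1.0]])
    joint = biv.cdf([alpha, alpha])         # Phi_2(alpha, alpha; w^2), from Eq. [5]
    market = joint - p**2
    idio = (p - joint) / N
    return market, idio

for N in (100, 1000):
    market, idio = variance_split(p=0.05, w=0.3, N=N)
    print(f"N={N}: market share of variance = {market / (market + idio):.1%}")
```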

Three methods to obtain the portfolio distribution

In this section, we use our simple example and the conditional setting described above to illustrate three alternatives to the current Monte Carlo approach. We begin with a method that relies on a rough approximation, but that can be implemented in a simple spreadsheet. The next two methods rely on successively weaker assumptions, but require more involved numerical techniques.

The Law of Large Numbers (LLN) method6

The Law of Large Numbers is a fundamental result in probability which states, loosely, that the sum of a large number of independent random variables converges to its expectation. For instance, this is the result that guarantees that, as we flip a coin more and more times, the proportion of flips which result in heads converges to one half. Likewise, we may apply this result to our portfolio once we have conditioned on $Z$. That is, we suppose that our portfolio is large enough that once we have fixed the market variable, the proportion of obligors that actually defaults is exactly $p(Z)$, and the portfolio value is equal to its conditional mean, $1 - p(Z)$. In effect, we assume that the portfolio distribution is just the distribution of $1 - p(Z)$, and that the second term in Eq. [6] is negligible. Note that the plots in Chart 3 serve as a guide to how valid this assumption

6 This subsection draws significantly from Vasicek (1997), which provides a more rigorous treatment of the limit arguments than is presented here.


might be. Intuitively, this assumption is one of "perfect diversification in size": the portfolio has risk only through its sensitivity to the market factor, not through large exposures to any individual obligor.

The most desirable feature of the LLN method is that it is trivial to compute percentiles of the portfolio distribution. Suppose we wish to find the 1st percentile portfolio value (that is, the 99% worst case value). Since we have assumed that the portfolio value depends only on $Z$, and this dependence is monotonic, $Z$'s percentiles map to the portfolio percentiles in a one-to-one fashion. Thus, since the 1st percentile of $Z$ is -2.33, the 1st percentile portfolio value is $1 - p(-2.33)$. Using this same logic, we see that for a particular value $v$, the probability that the portfolio is worth less than $v$ is given by

[7]  $\Pr\{V < v\} = \Pr\{1 - p(Z) < v\} = \Phi\left(\dfrac{\alpha - \sqrt{1 - w^2} \cdot \Phi^{-1}(1 - v)}{w}\right)$.

Differentiating the expression above with respect to $v$, we obtain the probability density for the portfolio:

[8]  $\dfrac{\sqrt{1 - w^2}}{w} \cdot \phi\left(\dfrac{\alpha - \sqrt{1 - w^2} \cdot \Phi^{-1}(1 - v)}{w}\right) \bigg/ \phi(\Phi^{-1}(1 - v))$.

We present plots of the portfolio density and CDF in Charts 4 and 5.

Chart 4
Portfolio density function using the LLN method. p=5%, w=50%.
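Because the LLN percentiles and distribution function are closed form, they reduce to a few lines of code; the sketch below (using the chart's p and w) is an illustrative addition.

```python
import numpy as np
from scipy.stats import norm

p, w = 0.05, 0.5
alpha = norm.ppf(p)

def p_cond(z):
    """Conditional default probability, Eq. [3]."""
    return norm.cdf((alpha - w * z) / np.sqrt(1.0 - w**2))

def lln_percentile(q):
    """q-th percentile of portfolio value: Z's percentile maps one-to-one."""
    return 1.0 - p_cond(norm.ppf(q))

def lln_cdf(v):
    """Eq. [7]: Pr{V < v}."""
    return norm.cdf((alpha - np.sqrt(1.0 - w**2) * norm.ppf(1.0 - v)) / w)

print("1st percentile value:", lln_percentile(0.01))   # uses Z = -2.33; approx. 71.1%
print("Pr{V < 0.9}:", lln_cdf(0.9))
```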


Chart 5
Portfolio CDF using the LLN method. p=5%, w=50%.

Returning to the two drawbacks of the standard Monte Carlo framework, we see that this method certainly addresses the first (computational complexity). On the other hand, while the avoidance of any Monte Carlo techniques guarantees that the method is not subject to simulation error, there is a significant chance of error due to the method's strong assumptions. This model error is certainly enough to suggest not using the method for capital calculations, yet it is difficult to ignore the model's simplicity.

One potential application of the LLN method is sensitivity testing. Since the computation of percentiles is so straightforward, it is simple to investigate the effect of increasing correlation levels or default likelihoods. Furthermore, even with more complicated portfolio distributions, it is possible to use the LLN method for quick sensitivity testing by first calibrating the two parameters, $p$ and $w$ (we have assumed away any dependence on $N$), such that the mean and variance of the LLN method's distribution match the mean and variance of the more complicated distribution. It is then reasonable to treat $w$ as an effective correlation parameter, and investigate the sensitivity of percentile levels to changes in this parameter. In this way, we use the distribution of $1 - p(Z)$ as a simple two parameter family which is representative of a broad class of credit portfolios. In principle, this is similar to Moody's use of the "diversity score" in their ratings model for collateralized bond and loan obligations (see Gluck (1998)).

We have seen that the conditional approach can provide useful, quick, but rough approximations of the credit portfolio distribution. While this is helpful, we would like to exploit the technique for something more advanced than "back of the envelope" calculations, and so in the next section we move to less rigid assumptions and a more involved technique.


The Central Limit Theorem (CLT) method

Loosely again, the Central Limit Theorem establishes that (appropriately normalized) sums of large numbers of independent random variables are normally distributed. It is said that because so many of the quantities we encounter - stock price changes over a year, heights of people, etc. - are the results of sums of independent random variables, the CLT explains why the normal distribution is observed so frequently.

In our case, we use the CLT to justify the assumption that given $Z$, the conditional portfolio distribution is normally distributed. Thus, in contrast to the LLN method, where we assumed that given $Z$ there was no variance, and therefore no randomness in the portfolio, here we assume that the conditional portfolio variance is as stated in Eq. [6], but that the shape of the conditional portfolio distribution is normal.7 To contrast the LLN and CLT approaches, consider Chart 6. In the LLN approach, we draw $Z$, which uniquely determines the portfolio value $1 - p(Z)$, indicated by the heavy line in the chart. Note that there is no dependence on portfolio size. In the CLT approach, we draw $Z$, and then assume that the portfolio is normally distributed, with mean given by the heavy line and standard deviation bands given by the appropriate lighter lines. Here, there is a dependence on $N$; further, we match two moments of the model distribution, whereas in the LLN approach we match only the expectation.

Chart 6
Conditional portfolio distribution as a function of $Z$.

The advantage of the CLT assumption is that conditional on the market factor, the portfolio distribution is tractable. Specifically, given $Z$, we know the portfolio's conditional mean, $1 - p(Z)$, and conditional variance, $p(Z)(1 - p(Z))/N$, and can then write the conditional probability that the portfolio value $V$ is less than some level $v$ as

7 The true shape of the conditional portfolio distribution is binomial.


[9]  $\Pr_Z\{V < v\} = \Phi\left(\dfrac{v - (1 - p(Z))}{\sqrt{p(Z)(1 - p(Z))/N}}\right)$,

where $\Pr_Z$ denotes the conditional probability given $Z$. To compute the unconditional probability that the portfolio value falls below $v$, we take the expectation of the expression in Eq. [9]:

[10]  $\Pr\{V < v\} = E[\Pr_Z\{V < v\}] = \displaystyle\int_{-\infty}^{\infty} \phi(z) \cdot \Phi\left(\dfrac{v - (1 - p(z))}{\sqrt{p(z)(1 - p(z))/N}}\right) dz$.

Although this integral is intractable analytically, it can be evaluated numerically by standard techniques. Thus, we can compute the entire portfolio distribution, which we present along with the LLN results in Chart 7 and Chart 8. We also present selected percentile values in Table 1. In both the charts and the table, we present CLT results for two choices of portfolio size $N$; recall that in the LLN method, we do not account for $N$, and therefore the results are the same for all portfolio sizes.

Chart 7
Portfolio density function for CLT and LLN methods. p=5%, w=50%.
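A sketch of the numerical evaluation of Eq. [10] follows, using one-dimensional quadrature over the market factor; the choice of quadrature routine and integration limits is ours, not prescribed by the article.

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

p, w, N = 0.05, 0.5, 200
alpha = norm.ppf(p)

def p_cond(z):
    return norm.cdf((alpha - w * z) / np.sqrt(1.0 - w**2))   # Eq. [3]

def clt_cdf(v):
    """Eq. [10]: unconditional Pr{V < v} as a normal mixture over the market factor."""
    def integrand(z):
        pz = p_cond(z)
        sd = np.sqrt(pz * (1.0 - pz) / N)    # conditional standard deviation
        return norm.pdf(z) * norm.cdf((v - (1.0 - pz)) / sd)
    return quad(integrand, -8.0, 8.0)[0]

print("Pr{V < 82.5%} =", clt_cdf(0.825))     # roughly 5%, consistent with Table 1
```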


Chart 8
Portfolio CDF for CLT and LLN methods. p=5%, w=50%.

Table 1
Percentile values for CLT and LLN methods. p=5%, w=50%.

Percentile    CLT, N=50    CLT, N=200    LLN
50%           97.2%        97.2%         97.1%
10%           86.7%        87.4%         87.7%
5%            81.5%        82.5%         82.9%
1%            68.9%        70.5%         71.1%
10bp          51.4%        53.8%         54.6%
4bp           45.1%        47.6%         48.5%

We make two general observations. First, for larger values of $N$, the CLT and LLN methods give more similar results; this is sensible, since it is at these values that the LLN assumption of zero conditional variance is more accurate. Second, the discrepancies between the two methods tend to be greater at more extreme percentile levels. This is also to be expected, as the more extreme percentiles correspond to market factor realizations where the conditional default probability, and therefore also the conditional portfolio variance, is greater than at less extreme percentiles.

In the end, the CLT approach is a step up in complexity from the LLN case, in that we need to evaluate the integral in Eq. [10] to obtain the cumulative probability distribution for the portfolio. However, the complexity of this computation is still far less than that of the standard Monte Carlo approach. Assuming we used the same number of sample points (say 1,000) in our evaluation of Eq. [10] as we use scenarios in the Monte Carlo procedure,8 the number of points necessary for Monte Carlo is still $N$ times greater than that for the CLT integration.



Thus, where the Monte Carlo approach treated the portfolio problem as having dimensionality $N$ (that is, as having $N$ degrees of freedom), the CLT approach (like the LLN approach) treats the problem as having dimensionality one. This is the key advantage of this and all conditional approaches: we deal explicitly only with the factors that the obligors have in common, and use approximations or closed form techniques to handle the factors that are idiosyncratic to the individual obligors.

On the other hand, the CLT approach is still subject to approximation errors, although less so than the LLN approach. In the LLN case, we guaranteed that one moment of the portfolio distribution would be exact, and assumed away the idiosyncratic variance; in the CLT case, we guarantee that the first two moments of the portfolio distribution are exact, and make our assumptions about the shape of the distribution given its variance. The price we pay (other than increased complexity) for this better approximation is the possibility of error in the numerical evaluation of the integral in Eq. [10].

The Moment Generating Function (MGF) method9

The last step in our hierarchy is to admit slightly more complexity, while modeling the entire portfolio distribution exactly. We rely on the fact that a distribution is uniquely characterized by its moment generating function (MGF). That is, if $E[e^{tX}] = E[e^{tY}]$ for two random variables $X$ and $Y$ and all values of $t$,10 then $X$ and $Y$ must have the same probability distribution. Intuitively, all of the information about a random variable $X$ is contained in the MGF $\psi(t) = E[e^{tX}]$. This is plausible if we observe that the mean of $X$ is equal to the first derivative of $\psi$ evaluated at $t = 0$; the expected value of $X^2$ is equal to the second derivative of $\psi$ evaluated at $t = 0$; and in general, the $n$th moment of $X$ is equal to the $n$th derivative of $\psi$ evaluated at $t = 0$.11 Therefore, if we can obtain the MGF of our portfolio value $V$, then in principle we have obtained the portfolio distribution. In practice, obtaining the distribution from the MGF can be involved numerically, yet in cases (such as ours, as we will see) where the MGF is a straightforward computation, the approach is very appealing.12

To compute the MGF for our portfolio distribution, we again rely on the conditional independence of the individual obligors. Letting $V_1, V_2, \dots, V_N$ denote the values of the individual loans, and $E_Z$ the expectation conditional on $Z$, we have that

[11]  $E_Z[e^{tV}] = E_Z[\exp(t(V_1 + \dots + V_N))] = E_Z[e^{tV_1} \cdot \dots \cdot e^{tV_N}]$.

8 This actually is an extreme case, as we can obtain the same precision in the evaluation of Eq. [10] with a hundred sample points as we would for the Monte Carlo procedure with several thousand sample points.

9 The author wishes to thank Andrew Ulmer for help with the computations for this method.

10 It is only necessary that there exist an interval containing 0 such that the equality holds for all $t$ in this interval.

11 See DeGroot (1986) or any standard probability text.

12 For integer-valued random variables (for example, the number of obligors which default in a given period), it is more common in practice to use the probability generating function, $E[\alpha^X]$, from which the probability distribution is simpler to obtain. This is the approach used by Nagpal and Bahar (1999). It would be reasonable to use that approach here, but we use the MGF approach to ease our generalizations later.


Now, since the $V_i$ are conditionally independent and have identical stand-alone distributions, we may write the right-hand side of Eq. [11] as the product of the individual moment generating functions, and obtain

[12]  $E_Z[e^{tV}] = E_Z[e^{tV_1}] \cdot E_Z[e^{tV_2}] \cdot \dots \cdot E_Z[e^{tV_N}] = (E_Z[e^{tV_1}])^N$.

To complete the calculation of the conditional portfolio MGF, we recall that, conditional on $Z$, $V_1$ takes the value $1/N$ with probability $1 - p(Z)$ and $0$ with probability $p(Z)$. Thus the conditional MGF for $V_1$ is

[13]  $E_Z[e^{tV_1}] = (1 - p(Z)) \cdot e^{t/N} + p(Z) \cdot e^{t \cdot 0} = e^{t/N} \cdot [1 - p(Z)(1 - e^{-t/N})]$,

and the conditional portfolio MGF is

[14]  $E_Z[e^{tV}] = e^{t} \cdot [1 - p(Z)(1 - e^{-t/N})]^N$.

Finally, to obtain the unconditional MGF, we take the expectation of Eq. [14] and obtain

[15]  $E[e^{tV}] = e^{t} \cdot \displaystyle\int_{-\infty}^{\infty} \phi(z)\,[1 - p(z)(1 - e^{-t/N})]^N\, dz$.

As was the case with Eq. [10], we must rely on numerical techniques to evaluate this integral. Note that in this setting, the CreditMetrics model looks similar to the CreditRisk+ model, with the only notable difference being the shape of the distribution of the conditional default probability $p(Z)$.13

To obtain the portfolio distribution from Eq. [15], we apply the standard Fast Fourier Transform (FFT) inversion (see Press et al. (1992)). This technique requires sampling Eq. [15] at values of $t$ spaced equally around the unit circle in the complex plane, that is, at $t = 2\pi i k / m$ for $k = 0, 1, \dots, m - 1$, where for fastest results $m$ is an integer power of 2.14 Applying this technique requires more operations than does the CLT approach, but there are no approximation errors other than the possible error in evaluating Eq. [15] and the similar error resulting from using a finite number of sample points for the FFT.
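The sketch below illustrates this generating-function inversion. To keep the inversion exact, we work with the generating function of the default count (the portfolio value is a deterministic function of the count) and choose m to be a power of 2 larger than N; this adaptation, and the grid integration over the market factor, are our own choices rather than the article's exact procedure.

```python
import numpy as np
from scipy.stats import norm

p, w, N = 0.05, 0.5, 200
alpha = norm.ppf(p)
m = 256                                   # power of 2; m > N makes the inversion exact

# Grid integration over the market factor (replaces the integral in Eq. [15]).
z = np.linspace(-8.0, 8.0, 2001)
weights = norm.pdf(z) * (z[1] - z[0])
pz = norm.cdf((alpha - w * z) / np.sqrt(1.0 - w**2))   # conditional default probability

# Generating function of the default count D, sampled at the m-th roots of unity:
# E[s^D] = integral phi(z) * [1 - p(z) * (1 - s)]^N dz.
s = np.exp(-2j * np.pi * np.arange(m) / m)
g = np.array([np.sum(weights * (1.0 - pz * (1.0 - sk)) ** N) for sk in s])

probs = np.fft.ifft(g).real               # P{D = d}, d = 0, ..., m-1
values = 1.0 - np.arange(m) / N           # portfolio value when d loans default

# 1st percentile: accumulate probability from the worst outcomes upward.
order = np.argsort(values)
tail = np.cumsum(probs[order])
print("1st percentile value:", values[order][np.searchsorted(tail, 0.01)])
```

The printed percentile should land near the MGF figure in Table 2 below.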

, where for fastest results, is an integer power of .14 Applying this techniquerequires more operations than does the CLT approach, but there are no approximation errors otherthan the possible error in evaluating Eq. [15] and the similar error resulting from using a finite num-ber of sample points for the FFT.

We present the results for all three techniques in Chart 9 and Table 2. There is very little discrepancybetween the three methods, and at more extreme percentiles, the MGF approach seems to agree moreclosely with the LLN. Checking our results with 10,000 Monte Carlo simulations, we see that it isimpossible to conclude that any of the three methods gives incorrect results. Thus, even if the choicebetween the methods is still unclear, we can be certain that all three provide significant improvementsover the slower Monte Carlo approach.

13 Gordy (1998) provides further details.

14 In our examples, we typically set $m = 128$.


Chart 9
Portfolio density function using MGF, CLT, and LLN methods. p=5%, w=50%, N=200.

Table 2
Percentile levels using MGF, CLT, and LLN methods. p=5%, w=50%, N=200.

Percentile    MGF      CLT      LLN
50%           97.0%    97.2%    97.1%
10%           87.4%    87.4%    87.7%
5%            82.5%    82.5%    82.9%
1%            70.6%    70.5%    71.1%
0.10%         54.4%    53.8%    54.6%
0.04%         48.6%    47.6%    48.5%

To this point, we have been able to avoid Monte Carlo simulation entirely, and to exploit the structure of the CreditMetrics model in a simple case to analytically estimate the portfolio distribution. Unfortunately, as we stray from our simple case (in particular, as we allow more market factors), the analytical techniques are no longer practical. However, we may still exploit the conditional framework by using the techniques described above and sampling the factors through a Monte Carlo procedure. This is the subject of the next section.

Extensions of the basic case

The three strong assumptions we used in the previous section were:

• only one market factor drove all of the asset value processes,


• there were only two possible credit states, and

all of our exposures were identical; that is, every exposure had the same size, default probability, recovery rate, and weight on the market factor.

In this section, we first present the framework and notation for a more general case, then discuss how our three earlier methods can be extended into a Monte Carlo setting, and finally present results for an example portfolio.

Framework and notation

For our more general case, we assume that there are two market factors (the extension to any arbitrarynumber will be clear) and that these two factors are independent.15 Denote the two market factors and . Let denote the possible rating states, with corresponding to the highestrating and corresponding to default. For , let the th obligor be characterized by:

• weights on the th factor, meaning that the obligors asset value change is expressed as

[16] ,

• probabilities of being in the th rating state at the horizon, and

• values , which our exposure to the th obligor will be worth if the obligor is in rating state at the horizon.

Rather than the single default threshold $\alpha$ from the previous section, we now have multiple thresholds that are specific to the individual obligors. Let $\alpha_i^r$, $r = 1, \dots, m$, denote the rating thresholds for the $i$th obligor, such that the obligor will be in (non-default) rating state $r$ if $Z_i < \alpha_i^r$ and $Z_i \geq \alpha_i^{r+1}$, and will default if $Z_i < \alpha_i^m$. We calibrate the thresholds as in Chapter 8 of the Technical Document, such that the probabilities that $Z_i$ falls between them correspond to the obligor's transition probabilities. Thus, we set $\alpha_i^1 = \infty$, and

[17] $\alpha_i^r = \Phi^{-1}\left( \sum_{s=r}^{m} p_i^s \right)$, $r = 2, \dots, m$.
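As an illustration of Eqs. [16] and [17], here is a short sketch (Python with numpy and scipy, as in the earlier sketch; the transition probabilities and factor weights below are invented purely for illustration) that calibrates one obligor's rating thresholds and simulates its asset value change.

    import numpy as np
    from scipy.stats import norm

    # Hypothetical transition probabilities for one obligor, ordered from the
    # highest rating (r = 1) down to default (r = m); they must sum to one.
    p_i = np.array([0.02, 0.90, 0.05, 0.02, 0.0098, 0.0002])

    # Eq. [17]: alpha_i^r = Phi^{-1}(sum_{s=r}^m p_i^s), so that
    # Pr{alpha_i^{r+1} <= Z_i < alpha_i^r} = p_i^r for each state r.
    tail = np.cumsum(p_i[::-1])[::-1]   # tail[r-1] = p_i^r + ... + p_i^m
    alpha_i = norm.ppf(tail)
    alpha_i[0] = np.inf                 # alpha_i^1 = +infinity by convention

    # Eq. [16]: the obligor's asset value change, given illustrative factor
    # weights and an independent idiosyncratic term epsilon_i.
    w1, w2 = 0.4, 0.3
    rng = np.random.default_rng(0)
    Z1, Z2, eps = rng.standard_normal(3)
    Zi = w1 * Z1 + w2 * Z2 + np.sqrt(1.0 - w1**2 - w2**2) * eps

    # The realized state is the lowest threshold still above Z_i:
    # 0 = highest rating, len(p_i) - 1 = default.
    state = int((Zi < alpha_i).sum()) - 1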

Analogously to the previous section, we observe that given the values of $Z_1$ and $Z_2$, the obligor transitions are conditionally independent. Let $p_i^r(Z_1, Z_2)$ denote the conditional probability that the $i$th obligor migrates to rating $r$, given a realization of $Z_1$ and $Z_2$. Noting that the condition that $Z_i$ is less than a threshold $\alpha_i^r$ becomes

[18] $\varepsilon_i < \dfrac{\alpha_i^r - w_{i,1} \cdot Z_1 - w_{i,2} \cdot Z_2}{\sqrt{1 - w_{i,1}^2 - w_{i,2}^2}}$,

15 Both Gordy (1998) and Koyluoglu and Hickman (1998) point out that the assumption of independence is not restrictive, since two correlated market factors may be expressed as linear combinations of two independent factors.



we obtain the conditional default probability as before:

[19] $p_i^m(Z_1, Z_2) = \Phi\left( \dfrac{\alpha_i^m - w_{i,1} \cdot Z_1 - w_{i,2} \cdot Z_2}{\sqrt{1 - w_{i,1}^2 - w_{i,2}^2}} \right)$,

and the conditional transition probabilities similarly:16

[20] $p_i^r(Z_1, Z_2) = \Phi\left( \dfrac{\alpha_i^r - w_{i,1} \cdot Z_1 - w_{i,2} \cdot Z_2}{\sqrt{1 - w_{i,1}^2 - w_{i,2}^2}} \right) - \Phi\left( \dfrac{\alpha_i^{r+1} - w_{i,1} \cdot Z_1 - w_{i,2} \cdot Z_2}{\sqrt{1 - w_{i,1}^2 - w_{i,2}^2}} \right)$, $r = 1, \dots, m - 1$.

With the conditional transition probabilities in hand, we have all of the information necessary to describe the conditional portfolio distribution. For instance, the conditional mean is simply

[21] $m(Z_1, Z_2) = \sum_{i=1}^{N} \sum_{r=1}^{m} p_i^r(Z_1, Z_2) \cdot v_i^r$.

The conditional variance and conditional MGF calculations are also straightforward:

[22] $\sigma^2(Z_1, Z_2) = \sum_{i=1}^{N} \sum_{r=1}^{m} p_i^r(Z_1, Z_2) \cdot \left( v_i^r - m_i(Z_1, Z_2) \right)^2$, where $m_i(Z_1, Z_2) = \sum_{r=1}^{m} p_i^r(Z_1, Z_2) \cdot v_i^r$ is the conditional mean of the $i$th exposure, and

[23] $E_{Z_1, Z_2}\, e^{t \cdot V} = \prod_{i=1}^{N} E_{Z_1, Z_2}\, e^{t \cdot V_i} = \prod_{i=1}^{N} \sum_{r=1}^{m} p_i^r(Z_1, Z_2) \cdot e^{t \cdot v_i^r}$.
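A sketch of Eqs. [19] through [23] follows (Python with numpy and scipy, as before; the function and array names are ours, not the paper's). It computes per-obligor conditional state probabilities and the conditional portfolio mean, variance, and MGF built from them.

    import numpy as np
    from scipy.stats import norm

    def cond_probs(alpha_i, w1, w2, z1, z2):
        # Eqs. [19]-[20]: conditional state probabilities for one obligor given
        # the factor realization (z1, z2). alpha_i holds the thresholds
        # alpha_i^1, ..., alpha_i^m of Eq. [17], with alpha_i[0] = +inf.
        denom = np.sqrt(1.0 - w1**2 - w2**2)
        below = norm.cdf((alpha_i - w1 * z1 - w2 * z2) / denom)  # Pr{Z_i < alpha_i^r}
        # States r = 1..m-1 get Phi(.) at alpha^r minus Phi(.) at alpha^{r+1};
        # default (r = m) gets Phi(.) at alpha^m alone.
        return below - np.append(below[1:], 0.0)

    def cond_mean_var(probs, values):
        # Eqs. [21]-[22]: conditional portfolio mean and variance, given the
        # N x m arrays of conditional state probabilities and state values.
        m_i = (probs * values).sum(axis=1)                        # per-obligor means
        v_i = (probs * (values - m_i[:, None]) ** 2).sum(axis=1)  # per-obligor variances
        return m_i.sum(), v_i.sum()

    def cond_mgf(probs, values, t):
        # Eq. [23]: conditional MGF, a product over obligors of per-state sums.
        return np.prod((probs * np.exp(t * values)).sum(axis=1))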

We are now ready to discuss conditional Monte Carlo methods for the general case.

Conditional Monte Carlo using the LLN method

Recall from before that the key assumption for the LLN technique was that the conditional variance is negligible, and that given the market factors, the portfolio value is exactly equal to its conditional mean. Then since the conditional mean is a strictly increasing function of the market factor, we could obtain percentiles for the portfolio by calculating the percentile for the market factor, and calculating the conditional mean given this value.

In the general case, our assumption is the same: given the market factors, the portfolio value is exactly $m(Z_1, Z_2)$. However, since we now have multiple market factors, we cannot simply carry over factor percentiles to obtain portfolio percentiles. Rather, to arrive at the distribution of $m(Z_1, Z_2)$, we apply a Monte Carlo approach, generating a number of realizations of the two market factors, and

16 This procedure was discussed for the one-factor case by Belkin, Suchower, and Forest (1998).



evaluating the conditional mean in each case. We then estimate percentiles of the portfolio through the simulated percentiles of the conditional mean.
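A sketch of the LLN procedure just described, reusing cond_probs and cond_mean_var from the earlier sketch; the portfolio arrays alphas (N x m thresholds), weights (N x 2 factor weights), and values (N x m state values) are assumed to have been set up already, and the 625 trials follow the example later in the text.

    import numpy as np

    rng = np.random.default_rng(12345)
    n = 625   # factor scenarios, as in the example below

    def portfolio_cond_mean(z1, z2):
        # Conditional mean m(z1, z2) of Eq. [21], stacking per-obligor probabilities.
        probs = np.vstack([cond_probs(alphas[i], weights[i, 0], weights[i, 1], z1, z2)
                           for i in range(len(alphas))])
        return cond_mean_var(probs, values)[0]

    draws = rng.standard_normal((n, 2))
    cond_means = np.array([portfolio_cond_mean(z1, z2) for z1, z2 in draws])

    # LLN: a portfolio percentile is estimated by the corresponding order
    # statistic of the simulated conditional means.
    lln_1pct = np.percentile(cond_means, 1.0)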

Conditional Monte Carlo using the CLT method

For the CLT technique, our assumption in the general case does not change either. We assume that given a realization of the market factors, the portfolio distribution is conditionally normal, with mean $m(Z_1, Z_2)$ and variance $\sigma^2(Z_1, Z_2)$ given by Eq. [21] and Eq. [22], respectively. Given this assumption, the probability that the portfolio value falls below a particular level $v$ is given by a generalization of the integral in Eq. [10]:

[24] $\Pr\{V < v\} = E\left[ \Pr_{Z_1, Z_2}\{V < v\} \right] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \phi(z_1) \cdot \phi(z_2) \cdot \Phi\left( \dfrac{v - m(z_1, z_2)}{\sqrt{\sigma^2(z_1, z_2)}} \right) dz_1\, dz_2$.

With only two market factors, it may still be practical to evaluate this integral numerically; however, as the number of factors increases, Monte Carlo methods become more attractive. The Monte Carlo procedure is as follows:

1. Generate $n$ pairs $(z_1^1, z_1^2), (z_2^1, z_2^2), \dots, (z_n^1, z_n^2)$ of independent, normally distributed random variates.

2. For each pair $(z_j^1, z_j^2)$, compute the conditional mean $m_j = m(z_j^1, z_j^2)$ and variance $\sigma_j^2 = \sigma^2(z_j^1, z_j^2)$.

3. Estimate the portfolio mean by $\hat{m} = \sum_j m_j / n$ and the portfolio variance by

[25] $\hat{\sigma}^2 = \dfrac{1}{n} \sum_{j=1}^{n} (m_j - \hat{m})^2 + \dfrac{1}{n} \sum_{j=1}^{n} \sigma_j^2$.

4. Estimate the cumulative probability in Eq. [24] by the summation:

[26] $\Pr\{V < v\} = \dfrac{1}{n} \sum_{j} \Phi\left( \dfrac{v - m_j}{\sqrt{\sigma_j^2}} \right)$.

Note that in step 3, we do not rely on any distributional assumptions, and admit only the error from our Monte Carlo sampling. In step 4, we utilize the CLT assumption, and our estimation is subject to error based on this as well as our sampling.
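Steps 1 through 4 in code, with the same caveats as before: Python with numpy and scipy, our own names, and reuse of cond_probs and cond_mean_var along with the assumed portfolio arrays alphas, weights, and values.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(12345)
    n = 625

    # Steps 1 and 2: sample factor pairs and compute the conditional moments.
    m_j = np.empty(n)
    s2_j = np.empty(n)
    for j, (z1, z2) in enumerate(rng.standard_normal((n, 2))):
        probs = np.vstack([cond_probs(alphas[i], weights[i, 0], weights[i, 1], z1, z2)
                           for i in range(len(alphas))])
        m_j[j], s2_j[j] = cond_mean_var(probs, values)

    # Step 3, Eq. [25]: portfolio mean and variance estimates.
    m_hat = m_j.mean()
    s2_hat = ((m_j - m_hat) ** 2).mean() + s2_j.mean()

    # Step 4, Eq. [26]: estimated probability that V falls below a level v.
    def prob_below(v):
        return norm.cdf((v - m_j) / np.sqrt(s2_j)).mean()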

Conditional Monte Carlo using the MGF method

Our application of the MGF technique to the general case is similar. The generalization of Eq. [15], using the result in Eq. [23], is



[27] $E\, e^{t \cdot V} = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \phi(z_1) \cdot \phi(z_2) \cdot \prod_{i=1}^{N} \left( \sum_{r=1}^{m} p_i^r(z_1, z_2) \cdot e^{t \cdot v_i^r} \right) dz_1\, dz_2$.

We implement a Monte Carlo approach to sample pairs $(z_j^1, z_j^2)$ as before, evaluate the conditional MGF (the term in parentheses) in Eq. [27] for each pair, and aggregate the results to obtain an estimate of $E\, e^{t \cdot V}$. Obtaining the MGF for an adequate set of $t$ values allows us to apply the FFT inversion technique described earlier. This technique gives us the entire portfolio distribution, with errors due only to sampling, but not to any assumptions.
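Finally, a sketch of the conditional MGF estimator of Eq. [27] together with the FFT inversion, under the same assumptions and helper functions as above. For the FFT step we additionally assume the state values $v_i^r$ lie on an integer grid $0, 1, \dots, m_{fft} - 1$ (for example, after rounding to a fixed granularity), since some discretization is needed before a length-$m_{fft}$ FFT can recover the distribution.

    import numpy as np

    rng = np.random.default_rng(12345)
    n, m_fft = 625, 256                      # 256 values of t: an order-eight FFT
    t_grid = 2j * np.pi * np.arange(m_fft) / m_fft

    # Average the conditional MGF (the term in parentheses in Eq. [27]) over
    # the sampled factor pairs, at each FFT evaluation point.
    mgf = np.zeros(m_fft, dtype=complex)
    for z1, z2 in rng.standard_normal((n, 2)):
        probs = np.vstack([cond_probs(alphas[i], weights[i, 0], weights[i, 1], z1, z2)
                           for i in range(len(alphas))])
        mgf += np.array([cond_mgf(probs, values, t) for t in t_grid])
    mgf /= n

    # Invert as in the one-factor sketch: Pr{V = d}, d = 0, ..., m_fft - 1.
    dist = np.real(np.fft.fft(mgf)) / m_fft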

Example

To illustrate the performance of our three proposed methods, we analyze an example portfolio. The portfolio comprises exposures to 500 obligors, the ratings of which are distributed as in Table 3. The exposures are well diversified by size (no single exposure accounts for more than 1% of the total) and depend on two market factors; the dependence on the factors is such that the average asset correlation between obligors is about 30%. The portfolio has a mean value of 248.9 and a standard deviation of 2.56.

Table 3
Distribution of obligors across ratings. Example portfolio.

Rating   Percent of obligors   Default prob. (%)
AAA      10                    0.02
AA       20                    0.04
A        30                    0.05
BBB      30                    0.17
BB       6                     0.98
B        4                     4.92

We apply the three methods using 625 trials in each case; for the MGF method, we use 256 values of $t$.17 In addition, we perform a standard Monte Carlo simulation using 50,000 scenarios. The results are presented in Charts 10 and 11 and Table 4.


17 This corresponds to an inverse FFT with order eight.



Chart 10
PDF for example portfolio. [Chart not reproduced: x-axis portfolio value, 240-254; y-axis relative frequency; curves for CLT, MGF, and Monte Carlo.]

Chart 11
CDF for example portfolio. [Chart not reproduced: x-axis portfolio value, 220-245; y-axis cumulative probability, 1%-6%; curves for CLT, MGF, and Monte Carlo.]



Table 4
Percentile levels for example portfolio. Monte Carlo results represent lower and upper confidence estimates.

                                      Monte Carlo
Percentile   LLN     CLT     MGF     Lower    Upper
50%          249.3   249.3   249.3   249.4    249.4
10%          246.4   246.4   246.4   246.4    246.5
5%           244.8   244.7   244.7   244.3    244.5
1%           238.0   238.9   239.1   237.6    238.4
10bp         220.6   221.1   222.0   223.3    225.5
4bp          220.6   219.5   220.0   217.8    222.2
Time (sec)   36      46      263     1793

Our first observation is that all of the results are quite similar. In particular, there appears to be very little difference between the CLT and MGF results, suggesting that the assumption of conditional normality is appropriate, at least for this example. This is an encouraging result, and would provide significant memory and computational savings if it holds generally. Even more encouraging is that with only 625 samples of the market factors, we obtain (with CLT and MGF) a good estimate of even the four basis point (one in 2,500) percentile. Precision in the more extreme percentiles is a drawback for the LLN method (and standard Monte Carlo), as we use order statistics as estimates.18 Thus, to capture the four basis point percentile, we would need more than 2,500 trials; to capture the 1 basis point percentile, we would need over 10,000. If we still need to use as many trials as in the standard Monte Carlo case, there is little reason to use conditional simulations. For this reason, LLN in the general case is of little use.

With regard to the cost of the computations, the times in Table 4, while not representing optimized implementations, serve as good indicators of the relative benefits of performing conditional simulations. Using CLT or MGF, we obtain the same precision as with Monte Carlo, with up to a factor of forty improvement in speed. However, as the number of market factors increases, so does the sensitivity of the MGF and CLT results to the sampled factor points. Further investigation is necessary into how many market factors can be accommodated using these methods, and whether sophisticated sampling techniques might allow these methods to be extended to more factors.

In addition, further investigation into other sensitivities of the conditional methods is necessary. In particular, we have assumed that the portfolios do not have significant obligor concentrations (such as the case where one obligor in a portfolio of 200 accounts for 10% of the total exposure). The LLN and CLT methods especially would be subject to more error in the presence of significant concentrations. Nevertheless, it is likely that for all but the most pathological cases, the errors in our simulation methods are significantly less than the errors due to uncertainty in the model parameters.


18 For example, the estimate for the 1st percentile using Monte Carlo (and 50,000 trials) is the 500th smallest value.



Conclusion

We have discussed a number of possibilities for improving upon the standard Monte Carlo approach used currently in CreditMetrics. All of the possibilities are based on the observation that given the moves in the market factors that drive our obligors, the individual obligor credit moves are conditionally independent. This observation opens the door for us to utilize the many techniques and assumptions that are only applicable to the independent case. Further, this observation illustrates that the real dimensionality (that is, the number of independent variables that we must treat) of the CreditMetrics problem is not the number of obligors, but the number of market factors. Utilizing these facts, we obtain preliminary results that suggest that our new approaches provide the same accuracy as standard Monte Carlo in a fraction of the time. Continued investigation is planned.

References

Belkin, Barry, Stephan Suchower, and Lawrence Forest. "A One-Parameter Representation of Credit Risk and Transition Matrices," CreditMetrics Monitor, Third Quarter, 1998.

DeGroot, Morris H. Probability and Statistics, 2nd ed. (Reading, MA: Addison-Wesley Publishing Company, 1986).

Finger, Christopher C. "Sticks and Stones," The RiskMetrics Group, LLC, 1998.

Gluck, Jeremy. "Moody's Ratings of Collateralized Bond and Loan Obligations," Conference on Credit Risk Modelling and the Regulatory Implications, Bank of England, 1998.

Gordy, Michael B. "A Comparative Anatomy of Credit Risk Models," Federal Reserve Board FEDS 1998-47, December, 1998.

Koyluoglu, Ugur, and Andrew Hickman. "Reconcilable Differences," Risk, October 1998.

Nagpal, Krishan M., and Reza Bahar. "An Analytical Approach for Credit Risk Analysis Under Correlated Defaults," CreditMetrics Monitor, April, 1999.

Press, William H., Saul A. Teukolsky, William T. Vetterling, and Brian P. Flannery. Numerical Recipes in C, 2nd ed. (Cambridge: Cambridge University Press, 1992).

Vasicek, Oldrich. "The Loan Loss Distribution," KMV Corporation, 1997.


CDO Research Pack Version 1.0 5th September 2000

For more information on RiskMetrics CDO Software, CreditManager or CreditMetrics, please contact your RiskMetrics Representative, or contact us as below: T: +44 (0) 20 7842 0260 F: +44 (0) 20 7842 0269 W: www.riskmetrics.com E: [email protected]