Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010
Page 1

Modelling the CRM for the Correlation Trading Portfolio

Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland

May 19, 2010

Page 2

Agenda

Regulatory Requirements

Challenges in Meeting Regulatory Requirements

RBS Approach to CRM Calculation

Modelling Approaches and Assumptions

Price Risk

• Simulation of Market

Default Risk

Appendix

• Computational Implementation of CRM

Page 3

The All Price Risk Measure represents a special form of the Incremental Risk Charge, described in 7.10.55S R (1) for positions in the correlation trading book

The “All Price Risk Measure” must:

• Adequately capture all price risks at the 99.9% confidence interval over a capital horizon of one year

• Under the assumption of a constant level of risk

• And be run at least weekly

Price risk measures include:

• Defaults, including the ordering of defaults;

• Credit spread risk;

• Volatility of implied correlations, including the cross effect between spreads and correlations;

• Index to single names basis and implied correlation of an index to bespoke portfolios basis;

• Recovery rate volatility;

• Risks of dynamic hedging and the cost of rebalancing;

• Though interest rate and foreign exchange risk have not been explicitly mentioned, we consider these to be included in “All Price Risk”

Regulatory Requirements

Page 4

Agenda

Regulatory Requirements

Challenges in Meeting Regulatory Requirements

RBS Approach to CRM Calculation

Modelling Approaches and Assumptions

Price Risk

• Simulation of Market

Default Risk

Timing and Next Steps

Appendix

Computational Implementation of CRM

Page 5

Naive Implementation of CRM

Naively implementing the CRM, i.e., computing the 99.9% worst loss over a 1-year horizon on RBS’s entire correlation trading portfolio, is very difficult

For example, using Monte Carlo, we would need to evolve the market forwards in time, pricing and hedging the portfolio as per the desk, tracking P&L over a 1 year horizon

A back of the envelope calculation immediately reveals the high likelihood of failure:

Figure 1: Numbers and types of trades in our portfolio along with representative times to compute PV and risks for one trade on one computer

Trade Type                      Trade Count
Bespoke                         ~500
Nth to Default                  ~500
Index CDS                       ~3,000
Index tranches                  ~3,000
Single name CDS                 ~75,000

Computation                     Timing (seconds)
PV                              ~10
Parallel CR01                   ~200
Default Delta                   ~60
Recovery Delta                  ~200
Base Correlation Sensitivity    ~60

Page 6

Naive Implementation of CRM (cont’d)

Assuming a rehedging frequency of once a month, a grid of 300 computers, and the minimum number of paths needed to estimate the 99.9% confidence limit (i.e., 1,000), we would need ~2,600 hours to compute results for the bespoke CDOs alone
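That back-of-envelope arithmetic can be sketched directly from the Figure 1 timings; the per-path cost below (PV plus the four sensitivities at each monthly rehedge point) is one plausible reading of the assumptions, not the exact RBS calculation:

```python
# Back-of-envelope timing for the bespoke book, using the representative
# timings from Figure 1; the cost model itself is an illustrative assumption.
n_trades = 500          # bespoke CDOs
n_paths = 1_000         # minimum paths for a 99.9% estimate
n_rebalances = 12       # monthly re-hedging over a 1-year horizon
grid_size = 300         # computers on the grid

# Seconds per trade per rehedge point: PV + parallel CR01 + default delta
# + recovery delta + base correlation sensitivity
secs_per_point = 10 + 200 + 60 + 200 + 60

total_seconds = n_trades * n_paths * n_rebalances * secs_per_point / grid_size
hours = total_seconds / 3600   # same order of magnitude as the ~2,600 hours quoted
```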

Recalibration of the market (which needs to happen for every valuation and hedging time point) adds substantially to this timing

Over the next few slides we highlight:

• How one might address the core issue of computational intractability

• Issues in simulating the market

• Subjectivity of hedging

We will in effect pose a series of questions; the decisions that we have made form the basis of the RBS approach to computing the CRM. This will be discussed in more detail in the following section.

Page 7

Possible Areas of Optimisation: Pricing Algorithms

Choice of Algorithm

• Can we use convolution?

• Importance sampling for the Monte Carlo?

• Replace recursion?

The 1-factor Gaussian copula (with random recovery) is very popular because rapid computational schemes exist. The ASB algorithm (or variants thereof) is commonly used in the industry because it returns quasi-exact PVs and risks rapidly.

• Faster pricing approaches are well known in the literature; however, these are to some extent (uncontrolled) approximations to the true price.

– LHP (Large Homogeneous Portfolio)
– Conditional Gaussian approach (Shelton)
– Saddlepoint methods
– Stein

The choice of scheme depends on the trade-off between accuracy and speed

Optimisation of the Implementation

Parallelisation of the code - currently valuations and risks are computed on a grid. Buy more computers?

• Grid performance does not necessarily scale linearly - data passing is a limiting factor

Front office pricing code focuses on accuracy: potential speed-ups by, for example, relaxing tolerances whilst maintaining acceptable accuracy

Rewriting time-critical parts of the code in assembler?

Page 8

Possible Approaches: Changing Mapping Approaches

When pricing bespoke tranches within a copula based model, we apply mapping technologies to determine the base correlation for the bespoke - this reflects the different riskiness of the bespoke tranche relative to the index

Loss Fraction (“LF”) mapping (the approach used by RBS and much of the industry) is slow - it requires the inversion of prices to determine correlations

Consider the use of a faster mapping technique such as At the Money (“ATM”) mapping

• RBS front office uses LF mapping to risk manage their correlation book

• LF deltas differ from ATM deltas

• Valuing the current portfolio and hedges using ATM mapping rather than LF will make the book appear unhedged

• If we use ATM mapping, we would need to modify RBS’s current portfolio to achieve the same “level of risk” as per LF mapping and then apply a different mapping technique

Page 9

Possible Approaches: Changing Mapping Approaches

We demonstrate the effect of the different mapping approaches in two scenarios:

• Figure 2 shows the effect of ATM and LF mapping when mapping iTraxx S9 to CDX S9. Due to the significant differences between the two indices, neither mapping method produces satisfactory results. However, we note that LF mapping shifts the market correlation curve in the right direction (as opposed to ATM mapping)

• Figure 3 shows the effect of ATM and LF mapping when mapping iTraxx S9 5-year to 7-year. The two mapping methods produce similar results, with slightly higher correlation values for LF mapping

Figure 2: Mapping iTraxx S9 to CDX S9 using ATM mapping and LF mapping (correlation vs. strike; series: ATM mapping, LF mapping, CDX S9, iTraxx S9)

Figure 3: Mapping iTraxx S9 5Y to 7Y using ATM mapping and LF mapping (correlation vs. strike; series: ATM mapping, LF mapping, iTraxx S9)

Page 10

Subjectivity in Hedging

Typically traders hedge a position in a CDO tranche [a, b] using primarily the constituent CDSs and the index, and sometimes with an additional tranche [l, u]

• Delta hedging movements in the Single Name CDS

• Delta-hedging movements in the index

• Delta and gamma hedging movements in the index

• Hedging parallel shifts in correlation

• Hedging default risk

• Regression based hedging

Traders are free to use any combination of the strategies outlined above; the choice changes with market conditions and trader outlook

Algorithmically predicting the hedging strategy is therefore very difficult

Hedging is computationally expensive; furthermore, it is very subjective, and implementing only a simplistic approach will give rise to greater slippage

Page 11

Simulation of the Market

Simulating the universe of observed prices relevant to the CDO book forwards by periods up to one year is challenging

We need to model possible movements in yield curves and FX rates

There is a need to capture the dynamics of the market implied CDS spreads to model the price risk. Desiderata for the evolution of the CDS spreads include:

• Impact of rating migrations (jumps?)

• Empirical co-dependence between CDS spreads shows regional and sectoral variation

• Co-dependence between CDS spreads is time dependent - showing regime like behaviour

• Level dependent volatility

Modelling the index tranche market is, if anything, even more challenging

The observed index tranche market comprises standard tranches on the liquid indices

Given the occurrence of defaults, some of the original detachments have changed - e.g., for high yield, the original (0, 10%) tranche has been completely wiped out

Page 12

Simulation of the Market

Typically these index tranche prices are mapped into base correlations using the (random recovery) Gaussian copula. In simulating the market forwards in time, we need to evolve the price / correlation surface

Can we evolve correlations, e.g., additively?

• Correlations are clearly bounded in (0, 1)

However, the problem is far more subtle than this: it rapidly becomes clear that an arbitrary set of correlations does not describe a valid set of prices

Applying historical moves in base correlation to the current base correlation curve can lead to arbitrage situations, for example, negative tranche spreads

• In the following graphs, the 3 month move in base correlations from September 2008 to December 2008 is applied to the current base correlation curve to obtain a shifted correlation curve

• As can be seen from the graph on the bottom left, the resulting shifted base correlation curve results in tranche spreads which eventually become negative

Page 13

Evolving correlations can lead to arbitrage opportunities

Figure 4 - Historic Base Correlation Moves (iTraxx 5y): base correlation vs. detachment, Sep-08 and Dec-08

Figure 5 - Historic change applied to spot: base correlation vs. detachment, spot and shifted curves

Figure 6 - Base Correlation Tranche Prices: spot and shifted

Figure 7 - Base Correlation Tranche Prices, zoomed in: the shifted curve's tranche spreads turn negative at high detachments

Page 14

Simulation of the Market (cont’d)

For the prices of index tranches to be admissible (i.e., for the absence of arbitrage), a set of strong conditions must hold; these have effectively never been violated for the market quoted points.

Typically these conditions are expressed in terms of the ETL (Expected Tranche Loss), denoted here by ETL(T, K) = E[min(L_T, K)]

Intuitively, this is just the price of a European (capped call) option on the loss. More formally, we define it as the expected loss on an equity tranche of width K at time T, as seen from time 0.

A number of boundary conditions are immediately apparent:

• An equity tranche cannot lose more than its width, i.e., 0 <= ETL(T, K) <= K

• To ensure no arbitrage, the density of the loss distribution must be non-negative for all strikes and times. The ETL is just a normalised price of a call option on the loss; hence the non-negativity of the loss density implies that ETL(T, K) is concave in K

• Losses cannot be reversed - hence the ETL of an equity tranche must be a constant or increasing function of T
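The three admissibility conditions above can be checked mechanically on a discretised ETL surface; the strikes and ETL values below are hypothetical:

```python
import numpy as np

# Hypothetical ETL surface: etl[i, j] = ETL(T_i, K_j) for two maturities.
strikes = np.array([0.03, 0.06, 0.09, 0.12])
etl = np.array([[0.010, 0.016, 0.020, 0.022],    # T = 1y
                [0.018, 0.030, 0.038, 0.043]])   # T = 5y

def admissible(etl, strikes):
    ok_width = np.all((etl >= 0) & (etl <= strikes))     # 0 <= ETL(T, K) <= K
    ok_time = np.all(np.diff(etl, axis=0) >= 0)          # non-decreasing in T
    slopes = np.diff(etl, axis=1) / np.diff(strikes)
    ok_concave = np.all(np.diff(slopes, axis=1) <= 0)    # concave in K
    return bool(ok_width and ok_time and ok_concave)

valid = admissible(etl, strikes)
```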

Page 15

Agenda

Regulatory Requirements

Challenges in Meeting Regulatory Requirements

RBS Approach to CRM Calculation

Modelling Approaches and Assumptions

Price Risk

• Simulation of Market

Default Risk

Appendix

• Computational Implementation of CRM

Page 16

RBS Approach – Disaggregation of CRM calculation into Default and Price Risk

Issues with Simulation

1. Unfeasibly large number of computations required to estimate the 99.9th percentile

2. Calculating hedges is computationally very expensive

3. Hedging strategy is very subjective – dependent upon the market and trader’s view of the future

Definition of Price & Default Risk

• We term Price Risk to be the impact on the portfolio of all moves in the market except for a default event;

• Default Risk is defined to be the impact on portfolio value of default events

• Default events are irreversible; price moves are reversible. Names cannot come back out of default

• Different time horizons for Price Risk and Default Risk:

• We can hedge price risk – hence the time horizon for price risk is dependent on hedge frequency (days to 1 month)

• Defaults have a longer natural timescale – number of defaults in 1 month is minimal

RBS chosen approach: evaluate Price Risk and Default Risk separately, then aggregate to obtain the CRM. The constant level of risk assumption allows convolution of Price Risk (up to 1 month for re-hedging), cf. IRC

Reduces the number of computations required for Price Risk

Removes need for extensive computation of sensitivities and reduces subjectivity in choice of hedging algorithm

Need to evaluate default risk separately – defaults are irreversible. Use Monte Carlo for default risk.

Enables development of an importance sampling algorithm for Default Risk

Is more conservative: double counts defaults combined with large spread move scenarios

Page 17

Price Risk - Constant Level of Risk

Mathematically, the constant level of risk assumption translates to assuming an identical loss distribution after each time interval Δt, where Δt = 1 / (hedging frequency)

i.e., after every hedge interval we are able to re-hedge such that the overall riskiness of RBS’s portfolio is identical to today’s level

Assume Δt = 1 month. Then the constant level of risk P/L distribution over 1 year is the convolution of 12 copies of the 1-month P/L distribution. This is very powerful:

• We do not need to compute actual hedges, just monthly P/L.

• Convolution allows us to get easily into the tail i.e., to estimate 99.9%

• This leads to significant savings in time – the computation becomes feasible without the need to move away from our books and records valuation approaches (i.e., CRM and desk approaches are consistent)

• Removes the subjectivity in choice of hedging approach

• Obviously convolution cannot be used for defaults (names that default over a month would need to come back out of default !)
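As a sketch, the convolution step on a hypothetical discretised 1-month P/L distribution (integer P/L buckets with illustrative probabilities, not real desk numbers):

```python
import numpy as np

# Hypothetical 1-month P/L distribution on integer buckets centred at 0.
monthly_pl = np.array([0.01, 0.04, 0.20, 0.50, 0.20, 0.04, 0.01])  # sums to 1
offsets = np.arange(-3, 4)                 # P/L bucket centres for one month

# Constant level of risk: annual P/L = 12-fold convolution of the monthly P/L.
annual = monthly_pl.copy()
for _ in range(11):                        # 12 months = 11 further convolutions
    annual = np.convolve(annual, monthly_pl)

annual_offsets = np.arange(len(annual)) + 12 * offsets[0]   # buckets span -36..+36

# 99.9% worst loss: smallest bucket x with P(P/L <= x) >= 0.1%
cdf = np.cumsum(annual)
worst_999 = annual_offsets[np.searchsorted(cdf, 0.001)]
```

The tail bucket probabilities here are tiny powers of the monthly probabilities, which is exactly why convolution reaches the 99.9% tail without simulating a full year path by path.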

Page 18

Constant Level of Risk - Convolution

Page 19

Convolution lets us get into the tails!

Page 20

Price Risk – RBS Algorithm

1. Choose time-horizon over which portfolio could be re-hedged (2 – 4 weeks)

2. Simulate Market (index tranches, single name CDS yield curves, basis etc) over hedging interval using our historical simulation algorithm (see below)

3. Compute P/L over this period; repeat ~200 – 500 times to compute a distribution

• Use pricing technologies consistent with those used for books-and-records valuations (essentially identical analytics)

4. Use stressed market scenarios and probability weight (see below) to compute the full 1M P/L distribution.

5. Convolve N times (N = 12 if hedge frequency = 1M) to obtain full P/L distribution over 1Y

• The use of convolution assumes the absence of autocorrelation, i.e., that each month’s P/L distribution is independent of the next month’s

• We will quantify this by examining the impact on price risk of changing the hedging horizon

• From a final number perspective, the impact of autocorrelation will be captured via the use of stressed starting scenarios

Page 21

Stressed Starting Scenarios

More significantly, however, the constant level of risk assumption implies that (at the end of each hedging interval, despite significant market moves) we are able to re-hedge our CDO portfolio to the same level of riskiness as today

This is a strong assumption. We therefore aim to apply an approach similar to that used for the IRC, where we use stressed starting scenarios

Algorithmically:

• Choose 5 starting scenarios i.e., the market is in one of 5 starting scenarios (each with a weight – the Gauss Hermite weight).

– Our CDO positions will only be partially hedged to this scenario; the cost of this partial hedging will be part of the final P/L distribution

– The starting scenarios will correspond to dates on which the iTraxx, CDX and HY indices assumed the values implied by the Gauss Hermite percentiles

• The market is then evolved as per the algorithm above; the total loss distribution for 1M is computed, accounting for the impact of the stressed scenarios

Hedging

Allow partial (risk based) re-hedging of book when switching to stressed scenarios

Model the relevant cost of re-hedging – based on applicable market bid/offers but also by including a liquidity premium

Page 22

Stressed Starting Scenarios

Choose stress scenarios to be the market on particular days in our history

Proxy stress events by the absolute level of iTraxx spreads

Choose days in history corresponding to stress events by finding days when the quantile of the index matches the probability levels implied by Gauss-Hermite
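The scenario-selection step above can be sketched as follows; the spread history here is synthetic, and the 5-point rule is the one named in the text:

```python
import math
import numpy as np

# Stand-in iTraxx spread history in bps (synthetic; real data in practice).
rng = np.random.default_rng(0)
spreads = 100.0 + np.cumsum(rng.normal(0.0, 2.0, 1500))

# 5-point Gauss-Hermite rule, rescaled for a standard normal factor.
nodes, weights = np.polynomial.hermite.hermgauss(5)
z = nodes * math.sqrt(2.0)                    # standard-normal abscissae
probs = weights / math.sqrt(math.pi)          # scenario weights, sum to 1

# Map each abscissa to a cumulative probability, then to the matching
# empirical quantile of the index, and pick the nearest historical day.
cum = np.array([0.5 * (1.0 + math.erf(x / math.sqrt(2.0))) for x in z])
levels = np.quantile(spreads, cum)
scenario_days = [int(np.abs(spreads - lvl).argmin()) for lvl in levels]
```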

Page 23

Agenda

Regulatory Requirements

Challenges in Meeting Regulatory Requirements

RBS Approach to CRM Calculation

Modelling Approaches and Assumptions

Price Risk

• Simulation of Market

Default Risk

Appendix

• Computational Implementation of CRM

Page 24

Typical Approach in the Industry: Choose a stochastic differential equation (SDE) to describe the market data parameter (e.g., FX) that we wish to simulate

• Immediately introduces model dependence.

• Estimate the parameters of the SDE (Kalman Filtering)

• Simulate the SDE forwards to generate a possible future time series

Issues – why don’t we do this?

• Strong model dependence - if we model the market using a diffusion, we will never predict any jumps!

• Estimation dependent upon quality of history

• Very difficult when we want to simulate a group of inter related variables (e.g. spreads, yield curves, FX, rates) consistently

• Estimation very difficult in the multidimensional case!

• Typically attempt to capture codependence using static correlation; real codependences are far more complex – time dependent and show regimes

• Such an approach will struggle to preserve the shapes of curves (e.g. yield curves)

Price Risk: Simulation of Market Variables

Simulation of the Market

Yield Curve

Single Name Spreads

Index Loss Fractions

Page 25

Price Risk – Simulation of Market Variables

Schematic: starting from today's market at t0, apply a sequence of transformed historical changes H(Δ01_market), H(Δ12_market), H(Δ23_market), ... at t1, t2, t3, ..., with random jumps to new dates in the history, where

Δ_{t,t+1}_market = {Δ_{t,t+1}_spreads, Δ_{t,t+1}_FX, Δ_{t,t+1}_YC, ...}

Derive a time series of intra-period changes in market variables (FX, interest rates, etc.)

Historic changes cannot be applied to current data directly - define a transformation function H()

Apply changes as well as random sign-reversal of the entire market move: the drift is randomised while directional correlations are preserved


Page 26

RBS uses the Mahal, Rebonato et al. approach: apply a sequence of historical market changes to the current market

• Starting date is randomly chosen

• Dates of selected changes must agree across all risk drivers.

• Randomly jump from sequence to a new date

• Randomised trend reversal

• Preserves directional inter-dependence (so, no need to model correlations etc.).

Historical changes must be applicable to current market

• e.g. if current spread is 10bps, not realistic to apply ±100bps historical change

• Transform risk drivers:

Price Risk - Simulation of Market Variables


y = H[x]

y_sim = y_today + Δy_hist

x_sim = H⁻¹[y_sim]

• e.g. for proportional changes: H [x] = ln (x)

• (we use this transformation for FX rates)

• Historical changes should look like ‘white noise’ (not dependent on current market)
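The transform-apply-invert recipe above can be sketched for the (assumed) log transform used for FX; the rates below are made-up values:

```python
import math

# Apply a historical proportional change to today's level in transformed
# space: y = H[x], y_sim = y_today + dy_hist, x_sim = H^-1[y_sim].
def apply_hist_change(x_today, x_hist_start, x_hist_end,
                      H=math.log, H_inv=math.exp):
    dy_hist = H(x_hist_end) - H(x_hist_start)   # historical change in y = H[x]
    y_sim = H(x_today) + dy_hist
    return H_inv(y_sim)

# A 10% historical depreciation (1.00 -> 0.90) applied to today's rate of 1.25:
x_sim = apply_hist_change(1.25, 1.00, 0.90)     # 1.25 * 0.90 = 1.125
```

With H = ln, applying the historical change reduces to multiplying by the historical ratio, which is exactly the "proportional changes" behaviour the slide describes.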

Page 27

Price Risk - Market Simulation: Yield Curves


Individual rates: simple CEV-type transformations

CEV transformation: H[x] is defined piecewise over the regions [0, x_L], [x_L, x_R] and [x_R, ∞), applying a CEV-type change of variables on the band [x_L, x_R], with a constant C ensuring continuity outside the band

To be calibrated: the CEV exponent, x_L, x_R, C

We also have ‘band reversion’ parameters, but these are not necessary for 1-month changes

It is also necessary to check the shape of simulated curves - see Mahal et al.: ‘barbell’ effects, shape reversion, etc.

Again, not a problem for short time horizons

Page 28

Price Risk - Market Simulation: Spreads

General Approach

The history of an individual name is not necessarily relevant to modelling the spread dynamics of the same name today (e.g., Ford)

• For obligors that have experienced downgrades or corporate actions, a direct map to their own spread history would be unrepresentative of the behaviour they are likely to exhibit today

For any date, bucket names by sector and spread percentile range

For each path (start-date) randomly map each name into a name in the same historical bucket.

Apply the corresponding changes from the mapped name.

• Introduces more randomness and therefore a wider range of plausible outcomes

• Preserves correlations across an industry

• Captures cross-gamma risk concentrated by name
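The bucketing-and-mapping idea above can be sketched as follows; the names, sectors, spreads and historical moves are all made up for illustration:

```python
import random
from collections import defaultdict

# Made-up historical data: spreads (bps), proportional 1-period moves, sectors.
hist_spreads = {"FORD": 800.0, "GM": 950.0, "BP": 60.0, "SHELL": 75.0}
hist_changes = {"FORD": 1.10, "GM": 0.95, "BP": 1.02, "SHELL": 0.98}
sectors = {"FORD": "Auto", "GM": "Auto", "BP": "Energy", "SHELL": "Energy"}

def percentile_band(spread, width=100.0):
    return int(spread // width)          # crude stand-in for a percentile band

# Bucket historical names by (sector, spread band).
buckets = defaultdict(list)
for name, s in hist_spreads.items():
    buckets[(sectors[name], percentile_band(s))].append(name)

def simulate_spread(name, current_spread, rng=random.Random(7)):
    key = (sectors[name], percentile_band(current_spread))
    pool = buckets.get(key) or [name]    # fall back to the name's own history
    mapped = rng.choice(pool)            # random name from the same bucket
    return current_spread * hist_changes[mapped]

s = simulate_spread("FORD", 800.0)       # FORD maps to its own bucket here
```

Because the mapped name is drawn from the same sector and spread band, sector-level correlation is preserved while extra randomness widens the range of outcomes, as the bullets above require.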


Figure: Spread Mapping Exercise - names bucketed by industrial sector (x-axis) and spread percentile band (y-axis). Source: RBS

Transformation

Simplest model would be H[x] = ln(x)

(as used in Regulatory Stress Test)

However, we would expect some dependency on current levels of spreads

Maybe similar to Interest Rates

This is work in progress

Page 29


Price Risk - Market Simulation: Index Loss Fractions

General Considerations

Need a different parameterisation of index tranche prices beyond correlation

Simulated prices must be non-arbitrage-able across detachment points and across maturities

RBS is in the process of testing two alternative models, both involving Index Loss Fractions (“ILFs”)

ILFs are effectively the ratio of the Expected Tranche Loss for an equity tranche with strike K to the total expected loss (EL) of the index (i.e., the ETL of an equity tranche whose strike equals the maximum possible loss)

Index Loss Fractions underlie loss fraction mapping

RBS first simulates single name CDS spreads and the basis – we can therefore compute EL.

We then propose to simulate the ILFs (i.e., the above ratios), and then convert to tranche price

ILF_t(K) = E[min(L_t, K)] / E[L_t]
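The ILF definition can be sketched on a hypothetical discrete loss distribution (loss levels and probabilities below are illustrative):

```python
import numpy as np

# Hypothetical loss distribution for L_t (fractions of index notional).
losses = np.array([0.00, 0.03, 0.07, 0.15, 0.40])
probs = np.array([0.60, 0.20, 0.12, 0.06, 0.02])   # P(L_t = loss), sums to 1

def ilf(K):
    etl_equity = np.sum(np.minimum(losses, K) * probs)   # ETL of the [0, K] tranche
    return etl_equity / np.sum(losses * probs)           # normalise by index EL

# ILF is 0 at K = 0 and reaches 1 once K covers the whole loss distribution.
curve = [ilf(k) for k in (0.0, 0.03, 0.07, 0.40)]
```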

Page 30


Price Risk - Market Simulation: Loss Fractions Bounds

Loss Fraction Bounds

ILFs for any maturity need to be concave functions of the detachment point. We model changes so that simulated ILFs automatically have this property

Simulate the equity tranche, and for successively more senior tranches find lower and upper bounds for the ILF, say LB and UB

Define the tranche ‘Theta’ as the ratio

  θ = (ILF − LB) / (UB − LB)

which must lie between 0 and 1

Clearly we cannot just add changes to θ given the bounds; instead map onto the range (−∞, ∞) using the inverse normal cumulative distribution:

  S = Φ⁻¹[θ]

Additive changes in S will therefore always be valid. Are we done?
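The bounded-quantity transform above can be sketched as follows; the ILF and bound values are illustrative:

```python
from statistics import NormalDist

# theta in (0, 1) is mapped to S = Phi^-1(theta), so arbitrary additive
# shocks in S-space always map back to an ILF inside [LB, UB].
nd = NormalDist()

def ilf_after_shock(ilf, lb, ub, dS):
    theta = (ilf - lb) / (ub - lb)     # in (0, 1) by construction
    S = nd.inv_cdf(theta)              # map to the whole real line
    theta_new = nd.cdf(S + dS)         # additive shock in S-space
    return lb + theta_new * (ub - lb)  # back to a valid ILF

# Even an extreme shock cannot push the ILF outside its bounds:
up = ilf_after_shock(0.55, 0.4, 0.8, dS=+5.0)
down = ilf_after_shock(0.55, 0.4, 0.8, dS=-5.0)
```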

Page 31

Market Simulation – Loss Fractions Bounds

Figure: simulated base correlation (y-axis) vs. detachment point (x-axis) for the 0-x, 0-3 and 0-7 tranches; the right-hand graphs show a magnified view of the corresponding left-hand graphs


Page 32

Price Risk - Market Simulation: Loss Fractions Bounds

Figure 8: Change in S versus Expected Loss - unscaled (Source: RBS)

Figure 9: Change in S versus Expected Loss - scaled (Source: RBS)


Let us look at a plot of Historical S’s

Figure 8 shows a plot of changes in S for 5-year CDX versus Index Expected Loss

• There is clearly a pattern: as we go to higher expected loss the range over which S can vary decreases

• This effect appears more significant in the data than it is – fewer data points for larger EL

Hence S is not a good quantity to simulate

Figure 9 shows the impact of scaling S by Expected Loss, i.e., Z = S · G(EL)

G(·) is calibrated to different indices and maturities

• No pattern i.e., apply historical changes in Z to today’s market

Page 33

Agenda

Regulatory Requirements

Challenges in Meeting Regulatory Requirements

RBS Approach to CRM Calculation

Modelling Approaches and Assumptions

Price Risk

• Simulation of Market

Default Risk

Appendix

Computational Implementation of CRM

Page 34

Default Risk - Summary Schematic: Explanation

Simulating Defaults

Simulate defaults over a 1-year liquidity horizon

Approach employs same PD/Default correlation structure as IRC

• Through-the-cycle (i.e., long term) PDs based on, for example, historically experienced default rates

• However, we don’t know what stage of the credit cycle we will be in 1 year in the future. Hence we need to stress these PDs.

• Use a Merton firm value (Gaussian copula) type approach – familiar from IRC as described by the IRB.

• Stress the common factor to give default correlation/contagion effects

• Non-default spreads driven by the same systematic effects

• We need to integrate over systematic effects (use Gauss-Hermite if 1 factor model, Monte Carlo if multi factor)

Recovery rates randomised (driven by systematic effects)

Therefore we have a set of defaulted names (defaulted as per “real world” dynamics) and the times of default up to 1 year

Valuation

Given a set of defaults over 1 year, we would expect the spreads of the non-defaulted firms to have changed:

• If we do not allow contagion, FTD baskets would always make money on a default

We need to know the form of the entire market - index tranche prices, CDS spreads, FX, yield curves, basis etc. We propose to do this by using the value of the common factor to pick out dates where the empirical cumulative probability of the iTraxx / CDX index level corresponds to the cumulative probability of the common factor.
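The Merton-style default simulation above can be sketched with a one-factor Gaussian copula; the PD, correlation and portfolio size are illustrative assumptions, not RBS parameters:

```python
import numpy as np
from statistics import NormalDist

# One-factor Gaussian copula (Merton firm-value) default simulation.
nd = NormalDist()
rng = np.random.default_rng(42)
n_names, n_paths = 100, 20_000
pd_1y = 0.02                             # through-the-cycle 1-year PD (assumed flat)
rho = 0.30                               # assumed asset correlation
c = nd.inv_cdf(pd_1y)                    # default threshold on the asset value

Z = rng.standard_normal(n_paths)                  # common (systematic) factor
eps = rng.standard_normal((n_paths, n_names))     # idiosyncratic factors
assets = np.sqrt(rho) * Z[:, None] + np.sqrt(1.0 - rho) * eps
defaults = (assets < c).sum(axis=1)               # default count per path

mean_defaults = defaults.mean()          # close to n_names * pd_1y = 2
tail = np.quantile(defaults, 0.999)      # stressed common factor clusters defaults
```

Stressing the common factor Z is what produces the correlated, contagion-like clusters of defaults in the tail that the slide refers to.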

Page 35: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Default Risk – Detailed Explanation

Then we:

• Revalue the portfolio under the given market scenario, incorporating randomised recoveries, defaults, and blown-out spreads (V1)

• Revalue the portfolio under the same market scenario, incorporating blown-out spreads but no defaults (V2)

• Default P/L = V1 − V2

The impact of price risk is already captured

• A series of default events will cause the spread environment to change (possibly markedly). The aim is to capture this cross effect – these products are nonlinear!

Tail risk is identified by a 2-stage estimation process (importance sampling):

1. Run a large number of simulations (10,000) using approximate revaluation, and select the subset (1,000) giving the largest approximate losses

2. Compute the corresponding losses using exact revaluations, and take an appropriate tail average of these
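A toy version of the 2-stage scheme, where `approx_pv` and `exact_pv` are hypothetical stand-ins for the fast approximate and slow exact revaluers:

```python
# Two-stage tail estimation: cheap screening pass, exact revaluation of the
# worst subset only. Path counts follow the slide; the revaluers are stubs.
import numpy as np

rng = np.random.default_rng(1)

def approx_pv(scenarios):
    # Fast approximate revaluation: exact value plus small pricing noise.
    return -scenarios + 0.05 * rng.standard_normal(len(scenarios))

def exact_pv(scenarios):
    # Slow exact revaluation (stub).
    return -scenarios

scenarios = rng.standard_normal(10_000)       # stage 1: 10,000 paths
approx_losses = -approx_pv(scenarios)         # loss = -P/L
worst = np.argsort(approx_losses)[-1_000:]    # keep the 1,000 worst paths

exact_losses = -exact_pv(scenarios[worst])    # stage 2: exact revaluation
tail_avg = np.sort(exact_losses)[-10:].mean() # tail average of exact losses
print(f"tail loss estimate: {tail_avg:.3f}")
```

The screening pass only needs to rank scenarios well enough that the true tail survives into the exact-revaluation subset.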

Page 36: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Default Risk - Modelling Contagion Effects

Page 37: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.

RB

S00000

Optimisation – Improving on Stein?

The 1 factor Gaussian copula (with random recovery) is very popular because rapid computational schemes exist

• All such schemes are predicated on the fact that after conditioning on the common factor credits become conditionally independent

• The standard approach – the so-called ASB (Andersen–Sidenius–Basu) algorithm – computes this conditional loss distribution by recursion and is essentially exact

• Various approximations exist, all of which seek to approximate this conditional loss distribution

• Probably the most accurate approach in the literature is an application of the Stein approximation (El Karoui, 2008)

• We have implemented the Stein approach and extensively investigated its use for this problem. We have also developed an alternative, novel (i.e., not seen in the literature) Poisson approximation

• Both approaches are significantly quicker than standard recursion (by a factor of ~3)

• Both methods have been compared with Random Recovery Recursion on actual Index Tranche and Bespoke portfolios, for a range of:

– Spread scenarios
– Correlations
– Maturities
– Attachments / detachments

Our testing has encompassed stress events such as those produced by our market simulation.
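For reference, the recursion that the approximations are benchmarked against can be sketched as follows; this is a minimal ASB-style loop, assuming homogeneous unit losses for brevity:

```python
# ASB-style recursion: given conditional (independent) default probabilities,
# build the conditional loss distribution one name at a time.
import numpy as np

def conditional_loss_dist(cond_pds):
    """P(loss = k units) for independent defaults with unit loss per name."""
    dist = np.zeros(len(cond_pds) + 1)
    dist[0] = 1.0
    for p in cond_pds:
        # Loss of k units: either k units before this name and it survives,
        # or k-1 units before this name and it defaults.
        dist[1:] = dist[1:] * (1 - p) + dist[:-1] * p
        dist[0] *= 1 - p
    return dist

dist = conditional_loss_dist(np.full(100, 0.03))
print(dist[:5])
```

With homogeneous probabilities this reproduces the binomial distribution exactly; with name-specific `cond_pds` it remains exact, which is why it serves as the reference for Stein and Poisson.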


Page 38: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Normal Approximation

• The conditional loss distribution is bounded between 0 and the (factor-dependent) maximum loss. When the portfolio expected loss is neither too low nor too high, the loss distribution can be close to normal.

• Otherwise, however, the distribution accumulates at either extreme and the normal approximation deteriorates.

• The figures below compare a 100-name homogeneous loss distribution with its approximating normal for different levels of expected portfolio loss. Extremely low or high expected losses will always arise, since we are integrating across the market factor.

• (Note that these figures are qualitative comparisons only, with discrete distributions normalised by grid size. The tranche prices themselves give the true quantitative comparison.)
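The comparison can be reproduced numerically; a minimal sketch with hypothetical parameters, evaluating a moment-matched normal density on the integer loss grid:

```python
# Compare a 100-name homogeneous binomial loss distribution with its
# moment-matched normal approximation (density evaluated at integer losses).
import numpy as np
from math import comb, pi

n, p = 100, 0.10                   # assumed: 100 names, conditional PD 10%
k = np.arange(n + 1)
binom = np.array([comb(n, i) * p**i * (1 - p)**(n - i) for i in k])

mu, var = n * p, n * p * (1 - p)   # match mean and variance
normal = np.exp(-(k - mu) ** 2 / (2 * var)) / np.sqrt(2 * pi * var)

max_err = np.abs(binom - normal).max()
print(f"max pointwise error at p={p}: {max_err:.4f}")
```

Rerunning with `p = 0.01` shows the deterioration the slide describes: the binomial piles up near zero while the normal spills into negative losses.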


[Figure: three panels comparing the exact (binomial) 100-name loss distribution with its normal approximation, for expected portfolio loss = 0.01, 0.10 and 0.30.]

Page 39: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Standard Poisson Approximation

• A Poisson distribution is a natural approximation to the true conditional loss distribution when expected losses are low.

• The figures below compare the same 100-name homogeneous distribution with the usual Poisson approximation. As the portfolio expected loss increases, the accuracy deteriorates.

• The ranges of accuracy of the Poisson and normal approximations are complementary, so a threshold expected loss can be specified at which the approximation switches from Poisson to normal. For the example here this would typically be set around 0.10 to 0.15.

• If recoveries are inhomogeneous, however, the loss unit (grid size) becomes small and the distribution sparse on it, and the standard Poisson approach becomes problematic.
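The accuracy degradation is easy to quantify; a sketch comparing the binomial conditional loss distribution with the standard Poisson approximation (intensity λ = np) as expected loss grows:

```python
# Standard Poisson approximation to the binomial conditional loss
# distribution: accuracy deteriorates as expected loss n*p increases.
import numpy as np
from math import comb, exp, factorial

def binom_pmf(n, p):
    return np.array([comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)])

def poisson_pmf(lam, n):
    return np.array([exp(-lam) * lam**k / factorial(k) for k in range(n + 1)])

n = 100
errs = []
for p in (0.01, 0.10, 0.30):
    err = np.abs(binom_pmf(n, p) - poisson_pmf(n * p, n)).max()
    errs.append(err)
    print(f"p={p:.2f}: max pointwise error {err:.4f}")
```

The error grows monotonically across the three cases, matching the slide's point that Poisson and normal are accurate on complementary ranges.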


[Figure: three panels comparing the binomial loss distribution with the standard Poisson approximation; the panel titles all read “Expected Portfolio Loss=0.01”, but the axis ranges match the 0.01, 0.10 and 0.30 cases above.]

Page 40: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Adjusted Poisson Approximation

• The standard Poisson approximation uses the same loss grid as the true distribution. If instead we allow the approximating Poisson to have its own loss unit, we gain an extra parameter and a more flexible approach.

• At low expected losses, the adjusted Poisson is very similar to the standard Poisson, and its grid size is very close to that of the true distribution (homogeneous in this example).

• As the expected loss increases, the grid size decreases and the adjusted Poisson smoothly changes over to be very close to normal.
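One way to set the extra parameter is moment matching; the slide does not specify the calibration, so the sketch below is an assumption: choose the loss unit u and intensity λ so that uλ and u²λ match the true mean and variance.

```python
# Adjusted Poisson: give the approximating Poisson its own loss unit u,
# here calibrated by moment matching (an assumed calibration).
def adjusted_poisson_params(n, p):
    mean, var = n * p, n * p * (1 - p)
    u = var / mean           # adjusted loss unit, = 1 - p in this case
    lam = mean**2 / var      # adjusted intensity, = n*p / (1 - p)
    return u, lam

for p in (0.01, 0.10, 0.30):
    u, lam = adjusted_poisson_params(100, p)
    print(f"p={p:.2f}: loss unit {u:.3f}, intensity {lam:.2f}")
```

At low p the unit is close to 1 (the true grid), recovering the standard Poisson; as p grows, the unit shrinks and the intensity grows, and a Poisson with large intensity is itself close to normal, which is the smooth changeover the slide describes.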


[Figure: three panels comparing the binomial loss distribution with the adjusted Poisson approximation; the panel titles all read “Expected Portfolio Loss=0.01”, but the axis ranges match the 0.01, 0.10 and 0.30 cases above.]

Page 41: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Comparison (Poisson vs Stein)

Price Differences versus Random Recovery Recursion: 0–3% Tranche

[Figure: two surface plots of asset-leg PV relative error against full recursion – (Stein − FullRec) and (Poisson − FullRec) – for the 5Y 0–3% tranche on the CDX9 portfolio, plotted over spread scaling factor and flat correlation; errors range from −1% to 6%.]

Page 42: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Comparison (Poisson vs Stein)

Price Differences versus Random Recovery Recursion: 9–12% Tranche

[Figure: two surface plots of asset-leg PV relative error against full recursion – (Stein − FullRec) and (Poisson − FullRec) – for the 5Y 9–12% tranche on the CDX9 portfolio, plotted over spread scaling factor and flat correlation; errors range from −40% to 10%.]

Page 43: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Comparison (Poisson vs Stein)

Price Difference Comparison for all Bespoke Tranches

[Figure: two histograms – Stein vs full recursion and Poisson vs full recursion, bespoke tranches – of PV difference as a percentage of notional, with the y-axis showing the number of trades as a fraction of book size.]

Page 44: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Default Risk Approximation

Default Risk modelling is time consuming

• Under full revaluation, each time a default occurs the model must, for each trade:
– Remove the defaulted name from the portfolio
– Calculate the expected recovery of the defaulted name
– Calculate the adjusted portfolio expected loss
– Iteratively re-calibrate the loss fraction curve(s) based on the new portfolio expected loss
– Re-price the adjusted tranche using new attachment and detachment points

In scenarios where we simulate a number of defaults occurring (i.e., tail risk), computation times increase dramatically


Page 45: Modelling the CRM for the Correlation Trading Portfolio Dherminder Kainth, Jan Kwiatowski & Douglas Muirden Royal Bank of Scotland May 19, 2010.


Default Risk Approximation – PV Interpolation

The time-consuming step in calculating the PV impact of defaults is the recalibration of the tranche loss fraction curve required for the new portfolio after defaulted names have been removed

To calculate the PV of a tranche on a portfolio that has experienced defaults, this recalibration step can be circumvented if we keep the portfolio the same but readjust the tranche attachment point downwards by the loss amount:

Operationally, for each trade portfolio, the mid spreads and durations of 15 tranches beneath the attachment point of the original tranche are pre-calculated

• These tranches are of the same tranche thickness as the original transaction

• The specific pre-calculated tranches depend on the original tranche attach and detach

Based on the simulated number of defaults, a loss amount is calculated, and the corresponding loss in subordination of the original tranche is derived

The PV of the defaulted tranche is then obtained by interpolating the subordination-adjusted tranche against the pre-calculated tranches
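The interpolation step can be sketched as follows; the grid spacing and the pre-calculated PVs below are hypothetical placeholders for the real pre-computed tranche values:

```python
# PV interpolation after defaults: instead of recalibrating, shift the
# attachment point down by the default loss and interpolate against a
# pre-computed grid of equal-thickness tranches. PVs here are placeholders.
import numpy as np

attach0, thickness = 0.05, 0.01            # original 5%-6% tranche
grid = np.linspace(attach0, 0.0, 15)       # 15 pre-computed attachment points
pv_grid = np.linspace(0.0, -0.8, 15)       # placeholder PVs (junior = worse)

def interpolated_pv(default_loss):
    """PV of the tranche after its subordination is eroded by default_loss."""
    new_attach = max(attach0 - default_loss, 0.0)
    # np.interp requires ascending x, so flip the descending grid
    return np.interp(new_attach, grid[::-1], pv_grid[::-1])

print(f"PV after 0.5% subordination loss: {interpolated_pv(0.005):.3f}")
```

The expensive recalibration is paid once, when the 15 reference tranches are priced; each simulated default scenario then costs only a lookup and a linear interpolation.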

[Diagram: a 5%–6% tranche after a 0.5% default loss. Under full revaluation the defaulted name is removed (with its recovery) and the tranche is repriced on the reduced portfolio; under PV interpolation the portfolio is kept and the tranche is instead treated as a 4.5%–5.5% tranche (attachment shifted down by the loss), priced off the pre-calculated tranche grid.]