
JUNE 2012 | Volume 7

The CVA Desk: Pricing the True Cost of Risk _ P.18
The Optimization of Everything: Derivatives, CCR and Funding _ P.26
Through the Looking Glass: Curve Fitting _ P.32
The Social Media World: What Risk Can Learn From It _ P.38
Stochastic and Scholastic: The Interconnectivity of Risk _ P.44

BACK TO THE FUTURE
Revisiting Capital and the Bank of Tomorrow

Not all risks are worth taking. Measuring risk along individual business lines can lead to a distorted picture of exposures. At Algorithmics, we help clients to see risk in its entirety. This unique perspective enables financial services companies to mitigate exposures, and identify new opportunities that maximize returns. Supported by a global team of risk professionals, our award-winning enterprise risk solutions allow clients to master the art of risk-informed decision making through the science of knowing better.

algorithmics.com



Table of Contents

BACK TO THE FUTURE P12: Revisiting Capital and the Bank of Tomorrow
THE CVA DESK P18: Pricing the Cost of Risk at Societe Generale
THE OPTIMIZATION OF EVERYTHING P26: OTC Derivatives, Counterparty Credit Risk and Funding
THROUGH THE LOOKING GLASS P32: An Empirical Look at Curve Fitting Counterparty Credit Risk Exposures
THE SOCIAL MEDIA WORLD P38: (and what risk can learn from it)
STOCHASTIC AND SCHOLASTIC P44: Assets, Liabilities and the Interconnectivity of Risk

BEST OF P02: Recent Awards and Recognitions
OPENING BELL P03: Responses to Uncertainty
IN CONVERSATION P04: IBM's Brenda Dietrich
IN REVIEW P08: Earth Audit
READING ROOM P10: A Roundup of New and Noteworthy Titles
THE LAST WORD P50: Risk Man's Desk

Volume 7, June 2012

PUBLISHER: Michael Zerbs
EDITORIAL AND ART DIRECTION: Touchback
CONTRIBUTORS: Leo Armer, Andy Aziz, David Bester, Bob Boettcher, Tom Chernaik, Mike Earley, Jon Gregory, Francis Lacan, Alan King, Gary King, John Macdonald, Cesar Mora, David Murphy, Yaacov Mutnikas, Martin Thomas
PRODUCTION & DISTRIBUTION MANAGER: Elizabeth Kyriacou
CONTACT INFORMATION: Algorithmics, an IBM Company, 185 Spadina Avenue, Toronto, Ontario, Canada M5T 2C6, 416-217-1500
[email protected]/think

© 2012 Algorithmics Software LLC, an IBM Company. All rights reserved. You may not reproduce or transmit any part of this document in any form or by any means, electronic or mechanical, including photocopying and recording, for any purpose without the express written permission of Algorithmics Software LLC or any other member of the Algorithmics group of companies. The materials presented herein are for informational purposes only and do not constitute financial, investment or risk management advice.


BEST OF

Our commitment to innovation has helped Algorithmics earn a number of public recognitions from industry publications, reader surveys, and judged panels year after year. Below is a list of awards we recently received:

Best Risk Management Technology Provider, HFMWeek's European Hedge Fund Services 2012.
Best Global Deployment for Algorithmics' collateral management client BNY Mellon, American Financial Technology Awards (AFTAs) 2011.
First place for Risk Management – Regulatory/Economic Capital Calculation, Structured Products Technology Rankings 2012.
First place overall for Enterprise-Wide Risk Management, and first place in Enterprise-Wide Market Risk Management, Risk Dashboards, Risk Aggregation, Risk Capital Calculation (Economic) and Collateral Management, Risk magazine's Risk Technology Rankings 2011.
Readers' Choice winner (Highly Commended) for Best Risk Management Product or Service, Banking Technology Awards 2011.
First place in Market Risk Management and ALM, Asia Risk Technology Rankings 2011.
Best Risk Analytics Provider, Waters Rankings 2011.
Best Solvency II Software Package, Life & Pension Risk Awards 2011.
First place overall, and first place for Scenario Analysis, Key Risk Indicators, and Operational Risk Loss Data Collection, Operational Risk & Regulation Software Rankings 2011.
Shortlisted, Best Post-Trade Risk Management Product for Algo Collateral, Financial News Awards for Excellence in Trading & Technology, Europe 2011.


OPENING BELL

Recent elections in France and Greece have added a new chapter to the ongoing sovereign debt crisis in Europe. At the time of this issue going to print, Greek voters had turned on the Conservative New Democracy and Socialist PASOK, two parties that have defined Greek politics for decades. New Greek parties from the left and right are divided in outlook but united in opposition to EU-IMF bailouts and their widely unpopular austerity measures.

In France, François Hollande has replaced former President Nicolas Sarkozy. “Europe is watching us,” said Hollande during his victory speech. “At the moment when the result was proclaimed, I am sure that in many countries of Europe there was relief and hope: finally austerity is no longer destiny.” Yet following both elections, Chancellor Angela Merkel of Germany clearly stated that neither she nor her government were interested in reopening the eurozone fiscal pact, or the strategy of deficit-cutting austerity measures.

What is the appropriate response in times of uncertainty and conflicting views on future direction? This has been an issue for financial services firms since the financial crisis. Regulators, governments and analysts have called for financial firms to change the way they do business.

One way that firms may be able to respond is by looking to how they have managed uncertainty in the past. In “Back to the Future”, this issue’s cover story revisits capital and its role in the bank of tomorrow. When early banks operated as partnerships with personal liability attached, every decision regarding capitalization and risk profiles was owned by decision makers. The impact of this framework on their business holds interesting implications.

Elsewhere in our pages are other features that explore new approaches to existing challenges. These include a look at interconnectivity and stochastic modeling, risk and social media, and the CVA desk’s function of pricing the true cost of risk. In “Through the Looking Glass” we return to the topic of curve fitting, with an empirical look at how chief risk officers and supervisors can gain critical insights into major exposures they would otherwise be unable to obtain.

In finance and politics, there will always be an element of uncertainty. As an industry and as global citizens, we will continue to identify and respond to the challenges of our times by searching the past, and by seeking solutions that have yet to be constructed.

Michael Zerbs
Vice President, IBM Risk Analytics


IN CONVERSATION
IBM's Brenda Dietrich

Brenda Dietrich has spent her professional career with IBM Research, and recently became the company’s first CTO of Analytics Software. In this issue’s conversation, Brenda discusses the nature of research, new data streams, and how the way we think about information is changing.

Th!nk: You have been connected with IBM Research since the mid-1980s. Has the company’s approach to research changed over this span?

BRENDA: It has changed quite a lot. In my early days with the group, IBM Research most closely resembled a think tank. Our job was to figure out cool things one could do with computing and computers first, and then to try and establish a shared vision within the company’s product lines. In that period we invented some wonderful things and published papers and patents. After we were done, it fell to others to find applications for our work. Over time, it has become more of a shared responsibility to connect our work with IBM product and service lines.

In the last decade or so, we in the Research division have been much more tightly engaged with actual end users. Part of our role is now to understand how people approach computing, how they would like to use computing, and doing experiments in the art of the possible with real people. And that is a huge amount of fun.


Th!nk: Why the emphasis on working with people?

BRENDA: Ten years ago, the research lab was focused on the algorithm. The operating model for the math team was, “someone gives me the mathematical representation of the business problem and I’ll work on the algorithm to solve it.” It would return a mathematical representation of the solution, or perhaps a code, and it was someone else’s role to fit that into the business process.

For the relatively static problems we were looking at then, like flight schedules for airlines or production schedules for the manufacturing floor, this worked reasonably well. But we now live in a world where things are much more dynamic. The easy stuff has been done. As we try to push the use of mathematics to support business decision making, we are working in problem domains that are much more subtle. Fine differences between the way two different enterprises in the same industry operate come into play.

Th!nk: Has the direction of research remained consistent, or has it evolved in surprising directions?

BRENDA: The GTO (Global Technology Outlook) is an annual process in the Research division. I have been directly involved in every GTO since 1995. In the 1990s, our focus was much more about speeds and feeds: how fast would storage be, how dense would storage be? How fast would access be? How many compact transistors and circuits would fit onto a chip? I’d estimate the split at that time was around 50% hardware, 50% how the hardware would be used. This past year it was maybe 10% hardware, and 90% how the hardware will be used.

Th!nk: What caused the change?

BRENDA: For many applications hardware is good enough now, whereas that wasn’t the case 15 years ago. We are no longer struggling with the challenge of how to do the things we want to do with computers. Now we are saying, “ok, we have this enormous wealth of data and computational power. What can we do with it?” In other words, it is much more focused on how we can use technology, less on how we can progress technology along its natural course.

We are also now more concerned with other perspectives. I would call this modeling compliance: do people do with computers the things they should do, and can they (both the computers and the people) adapt?

Th!nk: What would be an example of that type of compliance?

BRENDA: Think about the GPS in your car. I don’t always follow the instructions mine gives me. And I really wish that she would keep track of what I do and learn that “Brenda prefers this street to that route,” for whatever reason, and be responsive to that, rather than just yell at me and recalculate every time.

Th!nk: I would too. The information GPS devices pick up represents new data streams, which are a big focus of the 2012 GTO. What streams are out there?

BRENDA: We’re most familiar with structured data, which is generally numeric and tends to be nicely organized. You can find each of the pieces of it that you want, and nothing else. You can do queries against structured data. You can find averages, and ranges, and apply standard deviations.

A lot of people say structured data is data you do arithmetic on, but a lot of properly formatted text data is also structured. For example, the name field in a client record. You can’t average two names or talk about a range of names; that doesn’t make any sense. But you can match names against one another in a way to say, “these two instances are actually the same person versus they are different people.”

With geo-spatial data, we tend to be computing different types of metrics; what you mostly do with location data is compute distances. People we try to count, and then we try to categorize them.

Most catalog data is now also fairly well structured. You couldn’t do things like Amazon searches if their catalog data weren’t reasonably well structured. Now, it still may be imperfect, but it’s far less imperfect than five years ago.

Th!nk: Catalog data takes us online, which is where most unstructured data exists. How do you define it, and what is its significance?

BRENDA: I don’t have a concise definition for unstructured data, but I would say that unstructured data is the data that we don’t yet know how we are going to be computing with (as opposed to just storing, copying, and accessing). A lot of free form text data is unstructured. Once you have annotated it and tagged it with the associated metadata, you can say, “this sentence is about this person. It’s about where she lives. It’s about where she works. It’s about her relationship with other people.” When you have all of that in the metadata it starts to move towards the structured space.

But when it is just free form text, without the metadata, that’s unstructured. Most instances of data are very unstructured, like voice data or tweets. Then it gets more complicated when we include data coming off of sensor systems. We don’t know what we are going to do with it yet, and right now it is just delivering random measurements. We know what each field means, but we don’t yet know how we are going to be computing on them. So that is kind of in a strange land.

Th!nk: So the next challenge is to incorporate these new data streams?

BRENDA: Yes. Let’s talk about a really simple example – retail forecasting. The easily acquired structured data comes off the point of sale device. You obtain huge amounts because every item is scanned. You can sort through that data and count the number of a given item bought each day at each location. You can keep track of what combinations are bought together. And you can do simple time series to forecast how demand will evolve in the future.

You can also pull in other sources of data like weather, advertising events, or a one-time event, and begin to understand how they impact the consumption, demand, and purchasing of these items. If you extract the effect from the underlying signal, you can propagate that signal forward using your usual time series methods. You can then look at when these events are going to happen in the future, and put the multipliers back in. It can lead to an even better job of forecasting.
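To make the decomposition concrete, here is a minimal sketch of the kind of calculation Dietrich is describing: estimate an event multiplier from history, strip it out of the signal, forecast the underlying level with a simple average (standing in for “your usual time series methods”), and put the multiplier back in on scheduled future event days. The data and the moving-average forecaster are invented for illustration; this is not IBM code.

```python
# Hypothetical sketch: separate an event effect from a daily sales signal,
# forecast the underlying signal, then put the multiplier back in.

from statistics import mean

# Toy history: daily units sold and a flag for days with a promotion/event.
sales = [100, 98, 103, 150, 101, 99, 148, 102, 97, 104]
event = [0,   0,  0,   1,   0,   0,  1,   0,   0,  0]

# 1. Estimate the event multiplier from history.
base_days  = [s for s, e in zip(sales, event) if not e]
event_days = [s for s, e in zip(sales, event) if e]
multiplier = mean(event_days) / mean(base_days)

# 2. Remove the event effect to recover the underlying signal.
underlying = [s / multiplier if e else s for s, e in zip(sales, event)]

# 3. Forecast the underlying signal with a naive moving average
#    (a stand-in for a proper time series method).
level = mean(underlying[-5:])

# 4. Reapply the multiplier on days when future events are scheduled.
future_event = [0, 1, 0]
forecast = [level * multiplier if e else level for e in future_event]
print(round(multiplier, 2), [round(f, 1) for f in forecast])
```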


Th!nk: And data streams can be used to look at individual decision making as well.

BRENDA: To stay with retail for a moment, the area that everyone is looking at, especially for big-ticket items, is intent to buy. This is what people tweet about, what they post on social media sites, on various blogs, and mention in comment fields on sites like Amazon.

These are activities that tend to be done before the buy happens. A lot of them are probably noise. And so the two things you want to do are to detect a mention of a product, and to keep track of the mentions of that product. You can review the data by source, by what type of person, by time, and then compare that to the actual buys of the product that occurred at some later point. You want to understand – is there a decent correlation here? Is expression of intent a signal to buy? How powerful of a signal is it?

If this process gives differentiating insight to one of the actors in an economic ecosystem that the other actors don’t have, it helps create an advantage. And the name of the game right now is to find insight faster than anybody else.

Th!nk: Moving away from retail, what does the future hold for high performance users?

BRENDA: Over time I think it will become important to gain a better understanding of how data is used in combination. One of the first uses of mathematical algorithms to control how a computer actually worked dates back to the big flat platter disc drives. For those drives, you had to decide which track you were putting what data in. And you did the analysis up front of which data you were most likely to be accessing most often, so that data could go in the center track. The least frequently accessed data was at the very center and at the very edge.

You did this because the center is the point from which, no matter where the head happens to be, on average it’s the shortest distance to get to. So, this was a really important algorithm. It sped things up tremendously in terms of data access.

Th!nk: How does that relate to using data in combination?

BRENDA: Most of the advanced analytics we use pull in multiple sets of data. We might be pulling in bigger historical data sets when we are looking at one economic measurement versus another. Moving forward, we may want to pull in event information as well. We may want to see if an event, or a publication, or a blog, or some other signals affect our targets, and with what duration.

The goal would be to pull multiple sources of data and try to determine if one piece informs another piece in any way. Can we compute “a priori” the frequency with which two pieces or classes of data are going to be used together, and figure out some way to store them so that we can get them ready and together, at the same time? Because you can’t start the computations until you have both of those pieces. As long as the memory is in one place and the information is on a magnetic drive somewhere, you bring one in first and then wait for the other.

If we could fetch them together because they were the same distance apart, if you will, it would be very interesting. That’s my notion of using data together.

Th!nk: Let’s talk for a moment about titles. A Chief Risk Officer is common. I have heard people start to speak about a Chief Data Officer.

BRENDA: I would want to be Chief Analytics Officer.

Th!nk: What would that role be?

BRENDA: Within a company like IBM, analytics touch almost every part of our internal operations. We use analytics in human resources. We use them in supply chain. We use them in our own financial planning. We use them in our own risk analysis. We use them in facility planning. We use them in compensation planning. They’re everywhere.

There is a danger, however, of each individual group using different tools and different data to make the same sets of decisions. And a Chief Data Officer, who may or may not be your CIO, may be charged with being the one source of the truth. They are the keepers who know the data of record for everything, as there should be.

But there’s more than one way to analyze data. There are different techniques. This is a field where it does require some understanding of the theory behind the methods. In an enterprise that is using analytics across multiple business functions, it helps to have someone who will say, “this is our strategy; these are the tools we will use; this is how we will share; this is how we will be more efficient.”

It’s about more than having the same data go into two analysis processes. It is about the assumptions and the methodologies that are used in the analysis processes. That’s why I think a Chief Analytics Officer is going to be very important in the future in companies.

Th!nk: It sounds like the data we gather is changing, and the ways we gather that data are changing too. Does this mean we need to change the way we think about data itself?

BRENDA: As someone who grew up believing in the scientific method, that’s my comfort zone. Researchers and scientists generally say, “I want to look at the data, I want the data to inspire hypothesis, and then I want a way to test that hypothesis.”

We can’t necessarily create new experiments with everything that we come across today. What we can observe, which is vast, isn’t the law of physics. We almost have to think of ourselves differently. Perhaps as researchers we should no longer think of ourselves as laboratory scientists, but more like astronomers. We have great tools to observe the stars, but we can’t move them around.


IN REVIEW

Earth Audit
The finite supply of natural resources drives economies and influences pricing. Even though their costs may not be directly factored into all products and services, availability of key minerals can impact operational risk, markets, and capital. Using rough calculations, this energy audit illustrates how increases in living standards affect the rate of consumption, and brings an eye-opening perspective to the state of our planet.

IF DEMAND GROWS… Some key resources will be exhausted more quickly if predicted new technologies appear and the population grows:

ANTIMONY   15-20 years
SILVER     15-20 years
HAFNIUM    ~10 years
TANTALUM   20-30 years
INDIUM     5-10 years
URANIUM    30-40 years
PLATINUM   15 years
ZINC       20-30 years

SOURCE: ARMIN RELLER, UNIVERSITY OF AUGSBURG; TOM GRAEDEL, YALE UNIVERSITY


© 2007 Reed Business Information - UK. All rights reserved. Distributed by Tribune Media Services


READING ROOM

There’s a connection between our thought process and the courses of action we choose. These new and noteworthy titles explore the science of decision making, and the impact ideas can have on society as a whole.

The Price of Civilization by Jeffrey Sachs (Random House)

Poor Economics by Abhijit Banerjee and Esther Duflo (PublicAffairs)

Finance and the Good Society by Robert Shiller (Princeton University Press)

Too Big to Fail directed by Curtis Hanson (HBO Films)

Paper Promises by Philip Coggan (Allen Lane)

SCIENCE + James Gleick digs deep into The Information, a journey from the language of Africa’s talking drums to the origins of information theory. Gleick explores how our relationship to information has transformed the very nature of human consciousness. Charles Seife examines the peculiar power of numbers in Proofiness, an eye-opening look at the art of using pure mathematics for impure ends. In Being Wrong, Kathryn Schulz wonders why it’s so gratifying to be right and so maddening to be mistaken – and how attitudes towards error affect decision making and relationships. John Coates reveals the biology of financial boom and bust in The Hour Between Dog and Wolf. Coates, a trader turned neuroscientist, shows how risk-taking transforms our body chemistry and drives us to extremes of euphoria or depression.

SCREENING ROOM + Based on the bestselling book by Andrew Ross Sorkin, Too Big To Fail reshapes the 2008 financial meltdown as a riveting thriller. Centering on U.S. Treasury Secretary Henry Paulson, the film goes behind closed doors for a captivating look at the men and women who decided the fate of the world’s economy in a few short weeks.


Proofiness by Charles Seife (Viking Adult)

The Information by James Gleick (Vintage)

The Hour Between Dog and Wolf by John Coates (Random House)

Being Wrong by Kathryn Schulz (Ecco)

SOCIETY + Jeffrey Sachs has travelled the world to help diagnose and cure seemingly intractable economic problems. In The Price of Civilization, Sachs offers a bold plan to address the inadequacies of American-style capitalism. Abhijit Banerjee and Esther Duflo offer up a ringside view of Poor Economics, arguing that creating a world without poverty begins with understanding the daily decisions facing the poor. Philip Coggan’s Paper Promises examines debt, the global finance system, and how the current financial crisis has deep roots – going back to the nature of money itself. Robert Shiller believes that finance is more than the manipulation of money or management of risk. In Finance and the Good Society, Shiller calls for more innovation and creativity so that society can harness the power of finance for the greater good.

BACK TO THE FUTURE
Revisiting Capital and the Bank of Tomorrow

Over the decades there have been many views on what the bank of the future would be. Some ideas have been radical and some have been transitional, while others never really took hold. This particular view begins in Pawtuxet, Rhode Island, on the eastern seaboard of the United States. By all accounts a lovely place to visit, Pawtuxet is well known for scenic harbour views and boating along its historic river corridor. But in the early 19th century, textile mills and coastal trade dominated its landscape. As the community thrived and local businesses grew, the Pawtuxet Bank emerged. Common for its time, the bank was a partnership and its directors mostly merchant-manufacturers.

The Pawtuxet Bank’s directors shared personal liability in the event of loan default, or if the bank itself failed. In “The Structure of Early Banks in Southeastern New England”, Naomi R. Lamoreaux recounts the events of June 1840 when the bank’s stockholders presented themselves before the Rhode Island General Assembly. The group appeared seeking permission to reduce the bank’s capitalization from $87,750 to $78,000 in order to cover losses sustained due to the death of John Pettis, one of the bank’s directors:

Pettis died in 1838 with notes worth $8,800 outstanding at the bank and endorsements amounting to at least another $1,500...(t)his loss was not sufficiently large to cause the bank to fail. Nor did depositors or the bank’s own noteholders suffer. Most (91 percent) of the bank’s loans were backed by capital rather than notes or deposits, and the stockholders simply absorbed the loss.

We are unlikely to return to the capitalization levels or strict regional focus employed by the gentlemen of Pawtuxet. There are however crucial lessons to be learned when examining the structure and scope of early financial institutions. When we talk about addressing concerns over capital, funding, and liquidity, it just might be that what we need for the bank of tomorrow is not a new model, but rather one that takes inspiration from the bank of yesterday.

FIRST DEPOSITS

In the early history of banking, each partner made decisions knowing they shared liability if the bank failed. As a result, choices on whom credit should be extended to were not taken lightly. Unambitious business owners with a slight but steady production of widgets were considered ideal customers. Less attractive was the dubious repayment potential of innovative or entrepreneurial types. The latter group represented the potential of a phenomenal return on investment, but only if an unproven product or process succeeded.

This philosophy was in keeping with the dominant banking theory of the 18th & early 19th centuries. The real-bills doctrine proposed that banks should restrict the extension of credit to customers involved in the transfer of existing products only. Real-bills supporters argued that by basing loans on the security of actual goods, any individual bank’s liquidity was ensured.

While there is an admirable simplicity in tying loans to tangible goods, the skyline of the 19th century was starting to change. There were towers, factories and infrastructure projects that needed to be built, without existing product to offer in exchange for funding. And these projects were poised to generate great returns.

The unlimited liability model was ill-suited to finance these projects. When shareholders’ money was directly on the line, banks had good reason to avoid speculative projects. Because the incentives for self-discipline were so high, banks often lent only to those they knew best. These could be local businessmen, often engaged in the same type of industry as the bank shareholders. Often bank funds became personal resources for the shareholders themselves, and this type of insider lending, or trading, would frequently make up the majority of a bank’s exposures. As the world began to change, so did banks.

By the late 1850s, Great Britain had moved towards limited liability, with France following suit in 1867. As with unlimited liability, the logic was easy enough to follow: if a bank could diversify its investor base, there would be a greater availability of credit and capital. The adjustment would correct what had turned out to be the too successful risk measure of personal obligation: banks weren’t interested in funding anything risky.

In “Early American Banking: The Significance of the Corporate Form,” Richard Sylla suggests that the tipping point away from unlimited liability originated with the New York Free Banking law of 1838 which stated, “no shareholder of any such association shall be liable in his individual capacity for any contract, debt or engagement of such association.” New York’s free banking law didn’t just make limited liability possible. It opened the door for the incorporation of banks and the freedom from personal obligation.

ONE STEP FORWARD, TWO STEPS BACK

As banks moved beyond their villages in search of capital and opportunities, the strategies and measurements used in their operation changed as well. Instead of prudence being the only driver, customer profitability and shareholder value became ongoing concerns.

Expansions, mergers, and deregulation replaced local partnerships with a mandate to maximize customer bases and profitability. Operating at an extreme opposite of the early 19th century model were institutions like Alfinanz, an offshore administration factory that functioned as a global back office for a global network of financial advisors, intermediaries, or brokers.

As banks moved from private partnerships to public corporations, shareholder demands added another voice to how bank capital and risk would be managed. Enhanced returns were a factor in banks’ decisions to pursue diversification, more complex transactions such as structured products, and other strategies that gained support from managers operating with limited liability.

In the modern era of banking, even the idea of ‘who is a customer’ was up for grabs. In the 1990s, First Manhattan Consulting Group took a leading role in introducing profit-based segments to banking. First Manhattan came to prominence with the now famous conclusion that only 20% of a bank’s customers were profitable. Their idea to focus on profitable customers only was an attractive one to banks who were seeking to improve low revenue growth, particularly in core retail products. The concept also encouraged mergers and the creation of larger banks, who were better positioned to take advantage of segmentation opportunities.

Today we see banks retreating from these drivers and measures, often forced to adjust strategy by regulation, and perhaps in retreat from acting “in loco parentis”.

DYNAMIC CAPITAL MANAGEMENT
By Francis Lacan

To visualize the concept of dynamic capital management, think of flying an aircraft as close as possible to the ground. If you go too high, the cost of fuel becomes unreasonable. You cannot go negative because the option doesn’t really exist. The goal is to seek out the most efficient middle ground that best mimics the changing landscape below.

Managing capital dynamically would enable a bank to determine, on a day-by-day and month-by-month basis, the most efficient flight path for capital and allocate it accordingly. Optimization and anticipation are the two extremes guiding such decisions, and in the middle sits a big set of constraints. Basel III and its liquidity coverage ratio have restricted certain freedoms, particularly in terms of asset qualification. The other set of constraints is risk management. Liquidity is increasingly subject to risk management because there are many dependencies between funding liquidity and the rest of the risk.

It seems liquidity is following a similar path to what happened with capital and solvency. Banks didn’t invest much in economic capital, but the strong requirement to look at regulatory capital acted as an incentive to build more analytics, more rigorous reporting, and to become more serious about addressing uncertainty with the right tools. What is more complex for capital management is to connect all the sources of information. Pulling cash flow across entities and supporting good cash management today still has a lot of room to evolve. There are for example too many overlapping systems of information that are not very good at talking to one another. Overcoming this hurdle would be a huge step towards active capital management, rebalancing, and optimization.

The current baseline for automation is extremely rudimentary. The only truly mechanical element is the planning of short-term inflows and outflows within the Treasury, because their contractual commitments are relatively easy to model. The rest is treated as shocks, and the focus seems to be on the possible shocks that regulators are asking banks to address. As a result, banks are being pushed into modeling with greater consistency what may happen with respect to the different uncertainties tied to cash flows.

In the short term, banks will have to continue on the foundations of operational management of cash and collateral, addressing regulatory requirements for cash flow modeling and forecasting, asset qualification, and scenario modeling. Together, these elements will provide a rugged foundation to move towards automated decision making, and eventually, a more automated approach to at least some aspects of capital management.

This prediction comes with a number of ‘ifs’: if you are able to have very good and trustable aggregated pooling of all internal and external balances of cash in all currencies, and if you have access to a very good repository for your treasury operations so you can see your money market for all these currencies, you could to an extent begin to automate capital allocations for particular areas of the business. Decisions on how to refinance each of these currencies, and perhaps rebalance positions into a smaller number of currencies to save costs or optimize even the risk profile of certain transactions, could in theory be automated.

This won’t happen tomorrow. But with the proper foundation, we have the technology to make dynamic capital management part of the bank of the future.


Shareholder demands, which focused exclusively on the creation of shareholder value, must now be balanced against closer regulatory scrutiny and the need to protect customer interests.

Diversification led to its own set of challenges, as it did not help spread risk well. The credit crisis demonstrated that market risk and credit risk can appear in unexpected ways, and that the need to maintain strong liquidity positions was more crucial than realized.

All the short term profitability in the world cannot help if the system isn’t stable. And today, if you want stability, every discussion must begin with the importance of access to capital.

CAPITAL: THE ONCE AND FUTURE KING

In his memoir On the Brink: Inside the Race to Stop the Collapse of the Global Financial System, former U.S. Secretary of the Treasury Henry Paulson reflects back on the credit crisis. One of his conclusions is that the financial system contained too much leverage, much of which was buried in complex structured products:

Today it is generally understood that banks and investment banks in the U.S., Europe, and the rest of the world did not have enough capital. Less well understood is the important role that liquidity needs to play in bolstering the safety and stability of banks...(f)inancial institutions that rely heavily on short-term borrowings need to have plenty of cash on hand for bad times. And many didn’t.

Politicians and regulators have joined hands on capital, proposing measures that would lead to banks holding more of it. Many banks have argued against this approach, claiming that additional capital requirements would affect performance and competition. Yet recent investigations into the correlation between bank capital and profitability suggest that holding additional capital may not be a bad thing. Which is encouraging, since banks will likely have to do it anyway.

Allen Berger and Christa Bouwman’s interests are reflected in the title of their recent paper, “How Does Capital Affect Bank Performance During Financial Crises?” The authors examine the effects of capital on bank performance, as well as how these effects might change during normal times as well as banking and market crises. The empirical evidence led Berger and Bouwman to the following conclusions:

First, capital enhances the performance of all sizes of banks during banking crises. Second, during normal times and market crises, capital helps only small banks unambiguously in all performance dimensions; it helps medium and large banks improve only profitability during market crises and only market share during normal times.

Empirical evidence, regulatory measures, and perhaps common sense dictate that holding additional capital is a worthy goal for banks. Yet even if banks wanted to raise capital thresholds, it isn’t as easy as flipping a switch. A lack of cheap availability and reduced funding sources have changed the capital landscape. Banks of the future must focus on the preservation and leverage of available capital, and make that capital work harder.

Part of this focus must be organizational. Allocation of capital can no longer be controlled at a business unit, subsidiary, country, or branch level. It needs to be allocated at the time of doing business with specific customers, business lines, and even at a transaction level. Dynamic capital leads to a radically different structure, where the treasury becomes the ‘owner’ of capital, lending it to deal makers on demand.

A dynamic treasury requires great understanding of the uses and cost of capital, connected to the technological ability to ‘solve’ the problem of Big Data. In the sidebars to this article, my colleagues have expanded on the linked topics of dynamic capital and managing complex data.

BANKING ON THE PAST

In the early 19th century, the UK limited banking partnerships to six members. No one is suggesting banks return to this restriction. But if we think about various aspects of the unlimited liability banking model, it appears many of their tendencies are being echoed in calls from regulators and stakeholders.


Long-dated compensation reform and shareholder ‘say on pay’ programs can be seen as measures intended to update the shared liability and sense of ownership partners used to bring to banks. The credit crisis has driven home the importance of liquidity, and that gaining capital can be expensive – if it can even be acquired in times of a crisis. In a way this reflects the notion early bankers held that capital was expensive, and bringing in additional funds or partners would dilute earnings.

Insider lending and specialization that gave way to diversification and fewer restrictions on portfolios is being balanced by technologically enabled means to better know customers. Enhanced collateral management and approaches like CVA can be used to gain a deeper understanding of capital exposures before entering into an agreement.

If banks are to thrive in the future, preservation and leverage of available capital are crucial steps. Dynamic allocation, enabled by a treasury that quickly and effectively uses available capital in prudent ways, could perhaps be the defining characteristic of the bank of tomorrow. From the outside, these institutions would look nothing like the stakeholders of the Pawtuxet Bank, but they would be related in spirit.

DATA COMPLEXITY
By Leo Armer

In the Pawtuxet model, a small number of operating partners owned the bank’s data. It was their responsibility to collect information about their clients, and use this knowledge to guide business decisions.

Banks today have challenges managing data, in large part because the acts of collecting and analyzing information have become so separated. The greater this disconnect, the more important transparency becomes.

For both banks and clients, it’s crucial to be able to ask: “If this is my risk number, where did it originate? How do I track it? How can I see which systems it passed through, and what happened to it along the way?” Being able to take a number from a balance sheet or a general ledger and drill back to its origin provides a huge amount of confidence.

In order to make good decisions, you need to see the big picture. If data complexity is viewed purely as a technological issue, its strategic importance can be overlooked. When institutions attack data issues purely from an IT perspective, rules are created, transformations take place, and the data is considered ‘clean’ after going through a reconciliation process. Various systems and approaches are employed to ensure that the numbers coming out of the front system match numbers coming out of the general ledger system, and these match the numbers coming from treasury.

The problem is, as much as you can clean the data on a Monday, unless you change the people or method of entering that data, it’s going to need cleaning up again on Tuesday.

Today, a few firms are approaching data complexity from a business perspective. They have put their main focus on creating a single data warehouse where all the information is stored. This approach is based on the insight that every piece of data has a golden source: a reliable point of origin before it gets passed through different hands and different teams. It becomes as much about changing mental attitudes as it is about technical architectures.
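As a purely illustrative sketch (not Algorithmics’ or IBM’s design), one way to picture “drilling back to the golden source” is a value that carries its point of origin and every system it has passed through; the class and system names below are hypothetical.

```python
# Hypothetical sketch: a risk number that carries its own lineage, so a figure
# on the balance sheet can be traced back to its golden source.

from dataclasses import dataclass, field

@dataclass
class TracedValue:
    value: float
    source: str                                   # golden source, e.g. the booking system
    lineage: list = field(default_factory=list)   # systems the value passed through

    def through(self, system: str, new_value: float) -> "TracedValue":
        """Record a hand-off to another system (and any transformation of the value)."""
        return TracedValue(new_value, self.source, self.lineage + [system])

# A trade notional originates in the front-office booking system...
raw = TracedValue(10_000_000.0, source="front_office_booking")

# ...is converted to reporting currency in the middle office, then aggregated.
reported = raw.through("middle_office_fx_conversion", 7_800_000.0)
ledger = reported.through("general_ledger_aggregation", 7_800_000.0)

# Drill back: where did this number originate, and what touched it along the way?
print(ledger.source)    # front_office_booking
print(ledger.lineage)   # ['middle_office_fx_conversion', 'general_ledger_aggregation']
```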

Creating a golden source for data becomes even more crucial when we see what has happened in the last couple of years. CVA charges for example occur when a bank puts a variable fee on top of a deal, depending on whom they are trading with.

If your bank were to trade with another bank with which it had a long history and deep insight into its credit status, that bank would likely get a better price for the trade than a small finance company from Greece that might look less solvent.

In these transactions, where is the golden source? Is it in the middle office or front office data? Who is taking ownership over the trades? They can’t be processed the way they used to be, otherwise you’re swimming against the current demand for real time responses. If it takes six or seven days to work out what happened when a counterparty defaults, you’re too far behind the curve.

It is becoming more common to run into or hear discussions about appointing a CDO, or Chief Data Officer. A CDO, or at least the mindset within an institution that data quality is crucial and strategically relevant, can help banks evolve beyond workarounds and create a repository of golden source data. Through a framework where standards, direction, and architecture are provided to different departments throughout an organization, the bank of the future can overcome data complexity.


THE CVA DESK
Pricing the Cost of Risk at Societe Generale
By Bob Boettcher


Any time one bank takes a risk against another, the probability of default exists. To offset this concern, and to support ongoing stability within the interbank market, banks have long emphasized the importance of measuring and managing counterparty risk. Yet over the past few months banks have become noticeably less comfortable trading with each other.

The recent deterioration in credit ratings that has hit many U.S. and European banks has led to a heightened sensitivity over counterparty risk. These apprehensions may not be voiced directly, but they become evident when front office trades that would have cleared in the past no longer do because credit lines have been reduced.

As head of the CVA desk at Societe Generale Corporate & Investment Banking (SG CIB), David Murphy has a unique vantage point on interbank relationships.

“I wouldn’t want to overstate it – it’s not bringing the industry to a halt. But there is increasing focus on limiting exposures, even among global banks. And that is starting to affect the way we do business.”

CVA desks have grown in popularity as banks seek more effective ways to manage and aggregate counterparty credit risk. From his seat at SG CIB, David has a bird’s eye view on the challenges associated with establishing CVA desks, and the benefits banks can realize by gaining an active view on their portfolio of credit risk.

THE SETUP

Life used to be different – at least in terms of how counterparty credit risk was calculated. In the past, an interest rate swap would have been priced the same for every client. But Lehman’s default, and more recently the Greek sovereign stress, has changed all that. Now, no client is assumed to be truly risk free. Different prices are now expected for different clients on that same interest rate swap, depending on variables including the client’s rating and the overall direction of existing trades between both parties.

Noting their emergence, and particularly their activity in the sovereign CDS market, the Bank of England defined CVA desks in their 2010 Q2 report as follows:

A commercial bank’s CVA desk centralises the institution’s control of counterparty risks by managing counterparty exposures incurred by other parts of the bank...CVA desks will charge a fee for managing these risks to the trading desk, which then typically tries to pass this on to the counterparty through the terms and conditions of the trading contract. But CVA desks are not typically mandated to maximise profits, focusing instead on risk management.

The Bank of England’s summary captures the classic model for running a CVA desk, which Murphy has implemented at SG CIB. The classic approach incorporates three elements:

1. pricing of new trades
2. transferring risk to a centralized desk from individual desks
3. hedging or otherwise mitigating the aggregated risk on a global basis

On all new interest rate, FX, equity, or credit derivatives, CVA desks price the marginal counterparty risk for inclusion into the overall price charged to the client.

CVA is a highly complex calculation – and manually calculating that for the thousands of trades and potential trades that pass through a bank every day isn’t realistic. An effective automated system therefore becomes crucial to a CVA desk’s viability.
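The article doesn’t spell the calculation out, but a common textbook discretization of unilateral CVA sums the expected exposure in each period, weighted by the marginal default probability, discounted, and scaled by loss given default. The sketch below uses that standard form with invented numbers; it is not SG CIB’s model.

```python
# Hypothetical sketch of a textbook unilateral CVA discretization:
#   CVA ~ (1 - R) * sum_i  DF(t_i) * EE(t_i) * PD(t_{i-1}, t_i)
# All inputs are toy numbers, not a bank's actual exposure profile.

from math import exp

recovery = 0.4                                  # assumed recovery rate R
times    = [1.0, 2.0, 3.0, 4.0, 5.0]            # years
ee       = [1.2e6, 1.5e6, 1.4e6, 1.1e6, 0.7e6]  # expected exposure at each time
df       = [0.98, 0.95, 0.92, 0.89, 0.86]       # risk-free discount factors

hazard = 0.02                                   # flat hazard rate implied from a CDS-like spread

def survival(t):
    return exp(-hazard * t)                     # survival probability to time t

cva = 0.0
prev_t = 0.0
for t, e, d in zip(times, ee, df):
    pd_marginal = survival(prev_t) - survival(t)   # default probability in (t_{i-1}, t_i]
    cva += (1.0 - recovery) * d * e * pd_marginal
    prev_t = t

print(round(cva, 2))
```

A production desk would, of course, derive the exposure profile from Monte Carlo simulation across netting sets and collateral terms rather than from a handful of points.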


UPSIDES OF AUTOMATION

“For a plain vanilla trade with another bank done on an electronic trading platform, our target delivery time for the price is approximately 10 milliseconds,” says David. “On the other end of the complexity spectrum, highly-structured, long-dated trades may require two or three days to calculate the CVA price. Within this range we deliver CVA pricing within timescales that don’t delay the overall trade completion.”

While automated pricing copes well with vanilla products and the speeds required for those trades, there will always be exotic trades, trades where clients have a non-standard credit story, or a trade with special risk mitigation. In these cases, David has a team of four who provide this manual pricing to Sales and Traders on request.

“We try to reduce the need for manual pricing as much as possible, but the business will always have trades where they need someone to take a closer look. There may also be situations where we think the automated pricing isn’t good enough, so we want to take a look anyway and we don’t allow the sales team or traders to use the automated pricing provided,” he explains.

In the manual process, the CVA desk team often passes along suggestions to the salesperson for improving the credit risk in a trade and enabling the sales person to offer the trade at a lower credit price. Examples of that would include improving the collateral agreement with a client, or inserting a break clause.

“Via the manual process, we have educated our sales team and traders how they can change the credit risk (and reduce the price). With this knowledge, they now use the automated pre-deal CVA calculations to provide several CVA prices for different versions of the same trade. This allows them to achieve the best price for the client – while minimizing the counterparty risk.”

“Really, what our sales team are interested in, is earning as much as possible net of the CVA. Through the automated system tools, we’ve empowered sales and traders to do trades with the lowest CVA possible. So it’s worth their while spending time looking for that price, and they can now do that themselves quickly and efficiently – without the delays or extra resources required when using the manual pricing process.”

These ‘pre-deal’ checks are purely indicative – and optional for Sales and Traders. But if they don’t do this check, they face a big risk, because every time a new trade is booked, an ‘official’ CVA fee is then auto-calculated – which is then recorded alongside the trade – and will be deducted from the Sales/Trader performance.
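As a hedged illustration of the pre-deal check (not SG CIB’s system), the sketch below compares CVA charges for several versions of the same trade and picks the cheapest structure; cva_price and its multipliers are hypothetical stand-ins for an automated pre-deal calculator.

```python
# Hypothetical sketch: price several versions of the same trade pre-deal and
# pick the structure with the lowest CVA charge. The multipliers are invented.

BASE_CVA = 250_000.0   # toy CVA for the unmitigated trade

def cva_price(variant: dict) -> float:
    charge = BASE_CVA
    if variant.get("collateralised"):
        charge *= 0.35     # a collateral agreement sharply reduces exposure
    if variant.get("break_clause_years"):
        charge *= 0.70     # a mandatory break shortens the effective maturity
    return charge

variants = [
    {"label": "uncollateralised, full term"},
    {"label": "with 5y break clause", "break_clause_years": 5},
    {"label": "with 2-way CSA", "collateralised": True},
]

for v in variants:
    print(v["label"], round(cva_price(v)))

best = min(variants, key=cva_price)
print("lowest-CVA structure:", best["label"])
```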

PERILS OF PRICING

“A key challenge of building a CVA pricing system is ensuring real-time access to data in three categories: trade details; static data (client data such as rating, details of all pre-existing trades, netting status, collateral details etc), and market data.”

“Designing the system which has reliable and timely data in all 3 categories is crucial, given the impact that will have on pricing and/or hedging decisions. It’s a tough market with sophisticated competitors: if we under-price a risk, you can be sure we will start attracting a large market share. And if we over-price, then we lose business unnecessarily.”


HALFWAY TO HEDGING

Murphy’s first priority for SG CIB has been to ensure the CVA desk correctly prices the risk in all new trades. Now that this process is well-advanced, the desk will start to focus more on hedging – or otherwise mitigating – its legacy portfolio of credit risks. “For hundreds of years banks have managed reasonably well hedging 0% of their counterparty risk. So instead of an instant seismic shift to 100% hedging of all risks, we will be hedging selected segments of the credit and market risk – which avoids paying away all CVA income to the market,” says Murphy.

For banks evolving the CVA function, there are two main reasons hedging is not further along. The first is technology: they may not yet be fully confident in their risk measurement system, which requires a complex and time-intensive development period. The second reason is strategic: the bank might not think that all of its potential hedges are very useful.

“If you take SG CIB’s total portfolio of clients, just over 10% have a liquid CDS curve. In other words, for 90% of our clients, if we wanted to go and buy CDS protection, we couldn’t do it because there’s no market. To hedge these illiquid risks, banks would need to use some kind of credit index.”

But this is an imperfect hedge. In ‘normal’ times, the credit spread of the index and the ‘generic’ spread applied to calculate the client’s CVA will move in tandem. The hedge is therefore effective in reducing earnings volatility from day-to-day changes in the CVA Reserve. But, if the client deteriorates – or even defaults – due to an idiosyncratic reason, then the index hedge may not be affected at all (i.e. the hedge doesn’t work). So the decision whether to hedge illiquid names depends on what you want to protect against: actual losses following default… or earnings volatility caused by changes in market credit spreads.
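One common way to frame the index-hedging decision Murphy describes, sketched here with invented numbers and not as SG CIB’s methodology, is to size the index notional from the CVA’s sensitivity to the client’s proxy spread and an assumed beta between that spread and the index.

```python
# Hypothetical sketch: size a credit-index hedge for an illiquid counterparty
# whose CVA is marked off a "generic" proxy spread. All figures are invented.

cva_cs01   = 4_500.0   # change in the CVA reserve per 1bp move in the proxy spread
index_cs01 = 48.0      # P&L change per 1bp, per 1m notional of the credit index
beta       = 0.8       # assumed co-movement of the proxy spread with the index in "normal" times

# Index notional (in millions) that offsets day-to-day spread-driven CVA volatility.
hedge_notional_mm = beta * cva_cs01 / index_cs01
print(round(hedge_notional_mm, 1), "m of index protection")

# Caveat from the article: this hedge targets earnings volatility, not default.
# If the client jumps to default idiosyncratically, the index leg may not move at all.
```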


ACCOUNTING FOR BASEL

In the traditional CVA approach, a bank accepts a new trade, takes a fee and uses that fee to buy good hedges for all the risks in that trade. These hedges should eliminate all of the bank’s risk, but this is not necessarily the case once Basel III is taken into account.

Basel III does not recognize all types of hedges that the bank might want to use. Therefore the regulatory capital for certain trades will not be zero, even if the bank has used the full CVA fee to hedge all its risks.

The first impact Basel III has on CVA desks is on pricing. Pre-deal pricing needs to be reviewed to ensure the costs of imposed regulatory capital are covered. If not, additional pricing may need to be added. And the decision on which risks are efficient to hedge also becomes affected not just by strategic or business reasons, but also by the regulatory capital impact.

As part of Basel III’s updated regulatory capital guidelines, a new element has been added: VaR on CVA. Regulators have specified very precisely how the underlying CVA must be calculated for this charge. Banks will therefore need to decide whether to adjust their pricing and balance sheet CVA to match the Basel III rules, or to use different CVA calculations for pricing and regulatory purposes.


A DEFINING ROLE

When individual trading desks own risk, one desk may have a positive exposure to a client. This could lead the desk to hedge the positive exposure, without knowing that there was a negative exposure at another desk, which means the hedge wasn’t really necessary. Because the CVA desk owns all the risk from all the different derivatives desks, it has a full view of the risks with each counterparty, across all desks, products and locations, and can price and hedge the risk appropriately.

It has been suggested that a CVA desk is just a ‘smart middle office’. Murphy doesn’t agree: “The main difference between CVA and other front office trading functions is that most of the risks are originated internally from the bank’s other trading desks. But the CVA desk must price, originate, and distribute those risks in exactly the same way as any other front office trading desk.”

CVA desks have evolved to price, centralize, and manage a bank’s counterparty risks, requiring sophisticated modeling of hybrid risks encompassing every asset class that the bank is involved in. When implemented correctly, CVA desks should support an institution’s business and strategic vision, while helping banks maintain normalized relationships and control risk in an ever more complex trading universe.


THE OPTIMIZATION OF EVERYTHING
OTC Derivatives, Counterparty Credit Risk and Funding
By Jon Gregory

The global financial crisis has created much excitement over counterparty credit risk (CCR) and, in recognition of this, banks have been improving their practices around CCR. In particular, the use of CVA (credit value adjustment) to facilitate the pricing and management of CCR has increased significantly. Indeed, many banks have CVA desks that are responsible for pricing and managing CVA across trading functions. In addition to CVA, DVA (debt value adjustment) is often used in recognition of the “benefit” arising from one’s own default, and funding aspects may be considered via a funding value adjustment (FVA). Also, the impact that collateral has on CVA, DVA and FVA is important to quantify. Finally, there is a need to consider the impact of funding requirements and systemic risk when trading with central counterparties (CCPs).


The dynamics of trading OTC derivatives are becoming increasingly driven by the components mentioned above. Such a trend can only grow as regulation arising from Basel III creates the need for significantly increased amounts of capital to be held against CCR. It therefore seems likely that banks will not only invest significantly in building knowledge around the aforementioned concepts but will also optimize their trading decisions. For example, should one trade through a CCP or not? Is it preferable to trade with a counterparty via a two-way collateral agreement (CSA)? Should we collateralize via cash or other securities? What currency should I post cash collateral in?

There are a number of considerations around optimizing OTC derivatives trading with respect to CCR, funding, and systemic risk. From the point of view of a bank, an OTC derivative transaction depends very much on the type of counterparty to the trade. Most unsophisticated users of OTC derivatives will not post collateral against positions, while more sophisticated users will post collateral or trade through a central counterparty. This creates a wide spectrum of behaviour with respect to the CCR and funding aspects that we will discuss. A bank then has the issue of determining how best to optimize its trading across this spectrum.

THE IMPACT OF REGULATION
The Basel III rules will be phased in from the beginning of 2013 and will force banks to hold a lot more equity capital, much of it due to CCR requirements. Ballpark estimates are that most large banks will have to more than triple the amount of equity held compared with pre-crisis levels. Loopholes used to reduce capital requirements, such as off-balance-sheet entities, are being closed. A trillion dollars or so of extra equity will need to be raised by American banks by the end of the implementation of Basel III (2019), with European banks needing to raise a similar figure. Basel III will have a profound effect on banking behaviour. The changes will make all banking activities more expensive, in particular exposures held in the trading book.

Under Basel III, the changes around CCR (that will apply to banks from 1 January 2013) are particularly significant and include:

Stressed EPE. Banks which have permission to use the internal models method (IMM) must calculate exposures using data that includes a period of stressed market conditions, if this is higher than the standard calculation.

Wrong-way risk. Banks must identify exposures that give rise to a greater degree of “general” wrong-way risk and must assume a higher exposure for transactions with “specific” wrong-way risk.

Systemic risk. Banks must apply a correlation multiplier of 1.25 to all exposures to regulated financial firms with assets of at least $100 billion and to all exposures to unregulated financial firms (a simplified illustration follows this list).

Collateral. A “margin period of risk” of 20 days must be applied for transactions where netting sets are large (i.e. over 5,000 trades), have illiquid collateral, or represent hard-to-replace derivatives. The current time frame on such transactions is 5-10 days. No benefit can be achieved from downgrade triggers (e.g. receiving more collateral if the rating of a counterparty deteriorates). In addition, additional haircuts for certain securities and the liquidity coverage ratio will limit the amount of rehypothecation (reuse of collateral) and encourage the use of cash collateral. This ratio aims to ensure that a bank maintains an adequate level of unencumbered, high-quality liquid assets that can be converted into cash to meet its liquidity needs.

CVA VaR. Banks must hold additional capital to capture the volatility of CVA. This is in addition to the current rules that capitalise default risk.

Central counterparties. A risk weighting of 2% will be given to exposures to a CCP, not only via margin posted but also via the default fund contribution that must be made. In addition, the CCP must meet various rigorous conditions, including the establishment of a high specific level of initial margin and ongoing collateral posting requirements, and that it has sufficient financial resources to withstand the default of significant participants. While this represents an increase (from zero) in the capitalization of CCP exposures, it is intended to incentivise the clearing of OTC derivatives through CCPs.
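To make the systemic risk item above concrete, the sketch below applies the 1.25 correlation multiplier inside a simplified Basel IRB-style capital calculation. It omits the maturity adjustment and uses hypothetical PD and LGD inputs, so it should be read as an illustration of the direction and rough size of the effect rather than as the regulatory formula itself.

```python
from math import exp, sqrt
from statistics import NormalDist

N = NormalDist()  # standard normal distribution

def irb_correlation(pd, multiplier=1.0):
    """Simplified Basel corporate asset correlation, optionally scaled by the
    1.25 multiplier applied to large regulated / unregulated financial firms."""
    w = (1 - exp(-50 * pd)) / (1 - exp(-50))
    return multiplier * (0.12 * w + 0.24 * (1 - w))

def irb_capital(pd, lgd=0.45, multiplier=1.0):
    """Unexpected-loss capital at the 99.9% level, no maturity adjustment."""
    r = irb_correlation(pd, multiplier)
    k = N.cdf((N.inv_cdf(pd) + sqrt(r) * N.inv_cdf(0.999)) / sqrt(1 - r)) - pd
    return lgd * k

pd = 0.01  # hypothetical 1% default probability
print(f"without multiplier: {irb_capital(pd):.4f}, "
      f"with 1.25 multiplier: {irb_capital(pd, multiplier=1.25):.4f}")
```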

COLLATERAL AND CCPS
Collateral arrangements involve parties posting cash or securities to mitigate counterparty risk, usually governed under the terms of an ISDA Credit Support Annex (CSA). The typical frequency of posting is daily and the holder of collateral pays a (typically overnight) interest rate such as Eonia or Fed Funds. The use of collateral has increased steadily as the OTC derivatives market has developed: the 2010 ISDA margin survey reports that 70% of net exposure arising from OTC derivatives transactions is collateralized. A typical CSA converts some (but not all) of the underlying CCR into funding liquidity risk.

Despite the increased use of collateral, clearly a significant portion of OTC derivatives remains uncollateralized. This arises mainly due to the nature of the counterparties involved, such as corporates and sovereigns, which lack the liquidity and operational capacity to adhere to daily collateral calls. In such cases, a bank must consider the full impact of CCR and funding for the transactions in question. Since most banks aim to run mainly flat (hedged) OTC derivatives books, funding costs arise from the nature of hedging: Figure 1 illustrates a non-CSA trade being hedged via a trade done within a CSA arrangement.

When a counterparty does sign a CSA, the type of collateral is important. As Table 1 illustrates, the collateral must have certain characteristics to provide benefits against both CCR and funding costs. Firstly, in order to properly mitigate CCR, there must be no adverse correlation between the collateral and the credit quality of the counterparty. The posting of Russian government bonds by LTCM was a real-life illustration of the dangers of this form of wrong-way risk. Secondly, for collateral to provide a benefit against funding costs, it must be usable (since the economic ownership remains with the collateral giver) via rehypothecation, which means it can be posted as collateral or pledged via repo.

Collateral in securities that cannot be rehypothecated reduces CCR but does not provide a funding benefit. A sovereign posting their own debt in a CSA (as discussed recently when both Ireland and Portugal agreed to sign CSAs with some counterparties) would give the opposite effect, i.e. providing a funding benefit but not satisfactorily reducing CCR. Clearly cash collateral provides benefit against both CCR and funding.

Collateral posting through CSAs is becoming more widespread and streamlined (e.g. more cash usage) but there is another force that will create even more funding requirements for CCR. The Financial Crisis that developed from 2007 onwards suggested that better ways of controlling CCR needed to be found. Policy-makers have identified the widespread adoption of central clearing of OTC derivatives as one means of achieving this. Legislation such as the Dodd-Frank Wall Street Reform and Consumer Protection Act (passed by the US Congress in 2010) and the new European Market Infrastructure Regulation (EMIR) mandate that certain OTC derivatives transactions be centrally cleared through CCPs.

CCPs must have strong risk management practices to ensure that they can come close to their perceived role as being a panacea for CCR. In order to facilitate this, OTC derivatives clearing will focus on liquid, standardized products. From a collateral point of view, a CCP will go much further than the typical terms in a CSA. Most notably, CCPs require “initial margin” which is effectively an overcollateralization to provide a buffer against potential close-out costs if a CCP member defaults. Further to this, CCPs will generally require more frequent collateral posting (intra-daily in some cases) and require more liquid collateral (cash only in many cases). Finally, CCPs will be able to essentially change collateral rules at will (equivalent to re-writing a CSA without a counterparty’s consent) such as was recently observed when European clearing house LCH.Clearnet doubled its margin requirement on Irish government bonds.

Table 1. Impact of collateral type on CCR and funding.
Cash or rehypothecable, no adverse correlation: CVA and funding benefit.
Cash or rehypothecable, adverse correlation with the counterparty: funding benefit only.
Not rehypothecable, no adverse correlation: CVA benefit only.
Not rehypothecable, adverse correlation with the counterparty: no benefit.
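Table 1’s logic is simple enough to encode directly; the sketch below is just a sanity-check helper for classifying a proposed collateral type, with illustrative names only.

```python
def collateral_benefit(rehypothecable: bool, adverse_correlation: bool) -> str:
    """Classify a collateral type per Table 1: a CVA benefit requires no adverse
    correlation with the counterparty; a funding benefit requires cash or
    rehypothecable securities."""
    cva_benefit = not adverse_correlation
    funding_benefit = rehypothecable
    if cva_benefit and funding_benefit:
        return "CVA and funding benefit"
    if cva_benefit:
        return "CVA benefit only"
    if funding_benefit:
        return "Funding benefit only"
    return "No benefit"

# e.g. a sovereign posting its own (rehypothecable) debt: funding benefit only
print(collateral_benefit(rehypothecable=True, adverse_correlation=True))
```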

Figure 1. Illustration of the typical way in which CCR and funding are important in OTC derivatives. A bank trades with Counterparty A under no collateral arrangement, but must enter into a collateral arrangement (CSA or CCP) with Counterparty B for the trade used as a hedge. [The figure shows Counterparty A linked to the bank by the trade, and the bank linked to Counterparty B by the hedge.]


THE SPECTRUM OF OTC DERIVATIVES TRADING
Figure 2 illustrates the different collateral arrangements that can be present when trading an OTC derivative, ordered by their increasing impact in reducing CVA. Generally, two trends are important. Firstly, the ability of a bank to receive collateral: this is most limited with a 1-way CSA against the bank (since it must post and not receive collateral) and is at a maximum with a 2-way CSA. The impact of collateral when the transaction is centrally cleared is not as beneficial, due to the need to post initial margin. However, since CCPs are supposed to be of excellent credit quality (or too-big-to-fail), central clearing may be viewed by many as providing the maximum reduction of CVA. Indeed, it is unlikely that the CVA to a central counterparty will even be quantified.

OPTIMIZING OVER THE SPECTRUM
There are significant factors to be considered between the main ways in which OTC derivatives may be traded, namely with no CSA, with a 2-way CSA and via central clearing (Figure 3). CVA is most significant when there is no CSA and least significant under central clearing (assuming the default remoteness of the CCP). DVA, being the opposite of CVA, shows the reverse trend (most beneficial with no CSA and least beneficial under central clearing due to initial margin). Funding is least problematic with no CSA and becomes increasingly intensive under a CSA (collateral) and central clearing (overcollateralization). The assumption that CSA trades are more funding intensive is based on the understanding that a completely uncollateralized book of OTC derivatives would require no funding, but a fully collateralized book would require funding even if perfectly hedged, due to the mismatch between receiving and posting collateral. Finally, capital charges will be highest for uncollateralized trades, while benefit can be achieved for collateralized trades and the requirements are smallest for centrally cleared trades.

From Figure 3 we can see that there is a balance, and that no one form of trading is obviously the most beneficial. Uncollateralized trades have the best funding and DVA situation but are the most expensive in terms of CVA and regulatory capital charges. Centrally cleared transactions have the smallest CVA and regulatory capital charges but costly funding and no benefit from DVA. CSA trades are intermediate in all senses. It is therefore not clear which arrangement is the most beneficial for a bank. Furthermore, there are some additional points within each category that should be considered:

Uncollateralized trades. The advantage of uncollateralized trades is that the main issue is CVA (and the associated regulatory capital charges), which a bank can attempt to quantify and manage. This makes it most straightforward to identify the cost of trading at inception and incorporate it into prices. However, CVA hedging is far from trivial as, under Basel III, capital relief is not achieved on market risk hedges and only limited relief is given for credit index hedging (commonly a single-name CDS market does not exist for the counterparty in question).

CSA trades. A CSA has the impact of converting CVA into funding and liquidity costs. These are more opaque and may be harder to quantify than CVA. This has an obvious negative impact, as costs are harder to define at inception, but may have a positive effect in that such opaque risks are by their nature less well capitalized and regulatory capital is therefore lower.

Figure 3. Illustration of the impact of various factors on different OTC derivative trading arrangements. The arrows denote the relative increasing cost (or benefit reduction) of each factor. For example, CVA is largest in the No CSA case and smallest under central clearing. [The figure compares CVA, DVA, funding and the regulatory capital charge across uncollateralized (no CSA), collateralized (2-way CSA) and overcollateralized (CCP) trading.]

Figure 2. Illustration of the various collateral terms when trading an OTC derivative. [The figure shows 1-way CSA (against), 1-way CSA (in favour), no CSA, threshold CSA, 2-way CSA and centrally cleared arrangements along an axis of reducing CVA.]


FUNDING RISK & REWARDIn the Stanford university working paper, “fallacies, Irrelevant facts, and Myths in the Discussion of Capital Regulation: Why Bank equity is not expensive”, Admati, DeMarzo, hellwig and pfleiderer argue that with higher capital requirements and restrictions on leverage, the funding cost of banks will not increase because less risky entities will naturally achieve reduced funding costs. however, this view is not in general shared by banks that, in response to the significant increase in capital requirements in relation to CCR and liquidity factors, will look to optimize their CvA, funding, and capital costs so as to maximise the profitability of their derivatives businesses.

Another important aspect to look at for CSA trades is the “cheapest-to-deliver” collateral. The cost of posting collateral is the difference between the funding cost of that collateral and the return paid under the CSA (normally the overnight indexed swap rate). Multi-currency CSAs give optionality over collateral posting and, with the differences between the major rates significant, choosing the best collateral is important (a simple cost comparison is sketched below). This is consistent with the move to base derivatives valuation on the cheapest-to-deliver collateral: since a rational counterparty will always deliver the cheapest collateral, the implicit assumption is that one’s own counterparty will act in the same way.

Centrally cleared trades. In addition to the considerations mentioned above for CSA trades, CCP trades must be assessed based on the reduced capital and initial margin requirements. CCPs also have different collateral practices, for example requiring variation margin to be posted in cash in the currency of the underlying transaction (with the relevant overnight indexed swap rate used to discount the trade). This implies a change in NPV for a book of OTC trades migrated to a CCP, as the potential cheapest-to-deliver collateral terms from the CSA in question are essentially given up. For clearing members acting as intermediaries for non-clearing members and essentially providing margin lending (a collateral facility), the long-term cost of this must be considered.
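A minimal sketch of the cheapest-to-deliver comparison described above: the cost of posting each eligible collateral is taken as its funding cost minus the rate received under the CSA. All currencies, rates and funding assumptions are hypothetical.

```python
def posting_cost(funding_rate: float, csa_rate: float) -> float:
    """Annualized cost of posting a collateral: what it costs to fund it
    minus what the CSA pays on it (normally the relevant OIS rate)."""
    return funding_rate - csa_rate

# hypothetical eligible collateral under a multi-currency CSA
eligible = {
    "USD cash": posting_cost(funding_rate=0.0120, csa_rate=0.0025),        # vs Fed Funds
    "EUR cash": posting_cost(funding_rate=0.0100, csa_rate=0.0040),        # vs Eonia
    "Govt bond (repo-funded)": posting_cost(funding_rate=0.0060, csa_rate=0.0050),
}

cheapest = min(eligible, key=eligible.get)
print(cheapest, f"{eligible[cheapest]:.4%}")
```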

THE TOOLS REQUIRED
It is clear that there is much optimization possible in the trading of OTC derivatives contracts with respect to CCR, funding and regulatory capital. This can be seen from the activities of banks, such as a closer integration between collateral teams and trading desks. The precise optimization is clearly a huge challenge due to the complexity in defining the costs associated with CCR and funding, together with the cost of holding the required regulatory capital.

The next step for banks is to make sure they have all the tools in place to optimize their OTC derivatives trading as much as possible.

As such, banks require sophisticated systems for quantifying and managing CVA, which should also be able to consider the related impact of DVA and FVA. There is much effort in revisiting derivatives valuation (for example, using OIS discounting), which should be aligned with more efficient collateral management systems to achieve the most efficient collateral posting in each situation. Dealers are currently working with ISDA on a more standardized CSA (so that euro swaps would be collateralized with euros, for example), which would lead to a decline in CSA optionality. Finally, the current capital charges and the future rules defined under the new Basel III regime should be factored into all trading decisions. While regulatory requirements encourage the use of collateral and CCPs, banks may not view the additional funding challenges and systemic risk that this leads to as the most preferable economic outcome.


THROUGH THE LOOKING GLASS
An empirical look at curve fitting counterparty credit risk exposures
By Cesar Mora, Yaacov Mutnikas and Michael Zerbs



Editor’s note: In “Optimal Granularity” (TH!NK, Nov. 2011), Yaacov Mutnikas and Michael Zerbs suggested that curve fitting, combined with CCR (counterparty credit risk) data at a specific level of granularity, could provide unprecedented insights into CCR dynamics at firms.

This follow-up article summarizes the results of empirically testing different curve fitting techniques on CCR exposures.

Many financial institutions have invested in robust and sophisticated CCR systems to be compliant with the IMM (internal model method) and to address supervisors’ growing interest in assessing CCR more effectively. Replicating these systems at a transactional level may be ideal in theory, but institutions and supervisors are aware that acquiring this level of detail is impractical.

Curve fitting is an approach that could be adopted to measure CCR and assess the impact of stress tests without the need to replicate transaction-level data. One aspect that makes curve fitting such an attractive option is that new and confidential scenarios can be added rapidly and flexibly, without the need to approach firms first, because they can be introduced after the information has been collected.

The purpose of curve fitting CCR is not to replicate the exposures that could be generated with the combination of Monte Carlo scenarios and full valuation, or to provide an equivalent level of accuracy. Rather, curve fitting is proposed to give financial institutions and supervisors critical insight into major exposures under changing scenarios, which they would otherwise be unable to obtain. Curve fitting can also be useful for firms to get a quick assessment of CCR without having to perform a full revaluation of their positions. If adopted widely, curve fitting could be an effective means of developing a more meaningful yet parsimonious dialogue on major risk exposures between firms and supervisors.

GRADING THE OPTIONS
In this article we explore whether the promise of curve fitting holds up for a realistic data set, and which curve fitting approach works best. We performed an exercise together with the FSA on real-world counterparty data sets to evaluate curve fitting options. The approaches explored include linear and nonparametric curve fitting, higher order polynomials, and risk factor reduction via Principal Component Analysis (PCA) and Portfolio PCA. Portfolio PCA considers the size of the exposure to each risk factor when determining the most relevant principal components.

Our empirical results suggest that fitting CCR with a second-order polynomial, using linear curve fitting with Portfolio PCA, provides the most useful insights. Further, our results show that curve fitting can succeed in capturing the stochastic nature of the exposure of different counterparties at various netting set levels. The fitted equations can then be used to perform accurate, forward-looking stress testing without performing a full revaluation or requiring specific knowledge of the transaction-level data.

SETTING THE PARAMETERS
Assume that the supervisors provide the firms with around 200 multi-step scenarios. These scenarios would cover the longest maturity of the instruments present in a counterparty portfolio, or the relevant period of interest (e.g. one year), using a non-homogeneous time grid: for example, daily time steps for the first week, weekly time steps up to the first month, monthly time steps up to five years, and so on until the desired horizon has been reached. We have previously suggested that the stress scenarios use the risk factors that the European Central Bank defined for the 2011 European Banking Authority stress testing exercise, and that Portfolio PCA, as described in the Appendix, be used to focus on the risk factors most relevant to each firm’s exposure. The 200 scenarios could be a combination of Monte Carlo scenarios drawn from a distribution and stress scenarios created by the supervisors. This idea is very similar to the stressed EEPE approach, in the sense that stressed scenarios should be used in order to capture non-normal and higher-volatility events.
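A minimal sketch of the kind of non-homogeneous time grid described above; the exact step sizes and the five-year horizon are illustrative assumptions rather than a prescription.

```python
def scenario_time_grid(horizon_years=5.0):
    """Build a non-homogeneous grid in year fractions: daily steps for the
    first week, weekly steps to one month, then monthly steps to the horizon."""
    day, week, month = 1.0 / 365, 7.0 / 365, 1.0 / 12
    grid, t = [0.0], 0.0
    while t < week:               # daily steps for the first week
        t += day
        grid.append(round(t, 6))
    while t < month:              # weekly steps up to one month
        t += week
        grid.append(round(t, 6))
    while t < horizon_years:      # monthly steps out to the horizon
        t += month
        grid.append(round(t, 6))
    return grid

print(len(scenario_time_grid()), "time steps out to five years")
```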

Given the options, we suggest using Monte Carlo scenarios calibrated to recent data plus the stress test scenarios of interest. Since curve fitting is a statistical approach, it is indispensable to have enough observations for the calibration of its parameters. The Monte Carlo scenarios are consistent with the normal state of the world, while the stress scenarios are projections of possible crises. The stress scenarios could be a combination of what-if scenarios, or can be taken from different periods of crisis and significant market volatility.


By combining “normal” Monte Carlo scenarios and stress scenarios we can fit polynomials that allow for breaks in the original correlation assumptions. It is important to mention that the stress scenarios should be numerous enough to have an effect on the distribution of the scenarios: for example, if we choose 200 scenarios for curve fitting, we would expect at least 50 of them to come from stress scenarios.

The stress scenarios need to be rich and relevant in the sense that they need to cover a wide range of possible outcomes and boundary scenarios. If the polynomial is fitted using these boundary scenarios then the scenarios within that region will be feasible and will have some probability. However, if the stress scenarios are not relevant there will be some cases that are unachievable.

TO THE FIRM AND BACK
Once the scenarios have been created, firms will use them to do a full valuation of their portfolios. After the valuation is complete, firms will provide the MtM (mark-to-market) values, positive and negative, by time step, scenario and netting set, together with the counterparty hierarchies and first-order portfolio risk sensitivities to the most relevant risk factors. This should not pose any difficulties to firms with an already approved IMM system in place. Since CCR tends to be clustered, the firm can select the largest counterparties that together drive a large part – for example, 80% – of the firm’s overall exposure and provide the information described above for those counterparties to the supervisors. For the portfolio risk sensitivities, the firm will select the risk factors that contribute most to the exposure and will generate first-order portfolio sensitivities by netting set.

With this dataset, we have enough counterparty and exposure information to construct polynomials at the different netting set levels without the need to replicate information at a more granular level. Specifically, we can curve fit the values using a second-order polynomial and assess the goodness of fit with statistics such as R-squared and standard error. Values are turned into exposures using the counterparty hierarchies provided by the firms.
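A minimal sketch of this fitting step, assuming the netting-set values and the (Portfolio PCA-reduced) scenario factors are already available as arrays. It is an ordinary least-squares fit of a second-order polynomial (cross terms omitted for brevity) with an R-squared check, not the authors’ production implementation; the scenario data are synthetic.

```python
import numpy as np

def fit_second_order(factors: np.ndarray, values: np.ndarray):
    """Fit netting-set values with a second-order polynomial in the
    principal-component factors: constant, linear and squared terms."""
    design = np.column_stack([np.ones(len(factors)), factors, factors ** 2])
    coeffs, *_ = np.linalg.lstsq(design, values, rcond=None)
    fitted = design @ coeffs
    ss_res = np.sum((values - fitted) ** 2)
    ss_tot = np.sum((values - values.mean()) ** 2)
    return coeffs, 1.0 - ss_res / ss_tot

def exposure(values: np.ndarray) -> np.ndarray:
    """Exposure(t, s) = Max(Value(t, s), 0), per the formula in the Appendix."""
    return np.maximum(values, 0.0)

# hypothetical example: 200 scenarios, 3 retained principal components
rng = np.random.default_rng(0)
pcs = rng.normal(size=(200, 3))
vals = (0.5 + pcs @ np.array([1.0, -0.4, 0.2]) + 0.3 * pcs[:, 0] ** 2
        + rng.normal(scale=0.05, size=200))
coeffs, r2 = fit_second_order(pcs, vals)
print(f"R-squared: {r2:.3f}")
```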

Crucially, we can assess the impact of additional stress tests on the CCR exposures by applying the fitted polynomials to new scenarios without going back to the firms for more information. What-if scenarios can be constructed interactively and on demand.

The data set that we suggest firms provide is to some extent an extension of the data template for credit exposures proposed by the FSB (Financial Stability Board). In particular, the FSB requires principal amounts, gross MtM exposures, collateral, net MtM exposures, and potential future exposures. In our approach, having the MtM values at netting set levels by scenario and applying curve fitting allows us to derive PFEs and, where the scenarios include stress scenarios, to obtain “stressed PFE” figures as well. Furthermore, we believe that first-order risk factor sensitivities are also relevant pieces of information for firms to provide, as exposures tend to be driven by, and have different sensitivities to, specific risk factors. Our empirical results also show that, in line with the FSB’s objectives, better data can help contingency planning for stress events.

EMPIRICAL RESULTS AND FINDINGS
The data set used for the curve fitting analysis consisted of 18 counterparties. Each counterparty had a netting structure containing two subsidiaries, and each subsidiary had netting sets, non-netting sets, and a CSA in place. The instrument with the longest maturity in the data set had around five years left before expiry; however, curve fitting to a one-year exposure was also analyzed.

The data set contained approximately 20,000 instruments of the following asset types: interest rate swaps, cross-currency swaps, swaptions, equity swaps, European equity and index options, repos, forward rate agreements, FX options (American and Asian), and FX forwards. All the collateral was modeled as cash. There were 275 risk factors affecting the exposure of these instruments, consisting of interest rates, exchange rates, interest rate and equity volatilities, and market indices, and the positions covered 15 of the major currencies. For some particular analyses we considered specific counterparties with 33 or 76 risk factors. Using Portfolio PCA and a 98% to 99% variance criterion, we were able to reduce a risk factor space of 275 risk factors to between 33 and 36 components.

For netting sets containing linear instruments such as vanilla or cross-currency interest rate swaps, a linear curve fitting approach was sufficient to achieve a good fit of the exposure. However, netting sets containing instruments such as options required second-order polynomials in order to capture the non-linearity. As we started fitting netting sets with more risk factors, dimensionality and multicollinearity issues arose, leading us to use Principal Component Analysis. We were able to explain most of the variability with few components.

For example, we were able to reduce the risk factor space from 275x275 to 33x33 dimensions and maintain successful curve fitting results. In some cases where normal PCA was applied, the optimal number of PCs seemed to explain approximately 92.5%-95% of the variability. However, curve fitting was not successful in these cases and did not improve until extra components that materially impacted CCR exposures were included, even though they explained almost no risk factor variability. This led us to use Portfolio PCA, where the risk factors in the curve fitting can be weighted according to their impact on exposure.

The subsequent performance of our curve fitting with Portfolio PCA improved dramatically. However, note that, as opposed to normal PCA, where components are selected from the VCV of the whole universe of risk factors, in Portfolio PCA the ranking is specific to each of the netting sets. As a result, we need risk sensitivities by key risk factor for each of the netting sets in the counterparty hierarchy, and the principal components will be different at each level. In our empirical test the approach with the best performance was linear curve fitting with a second-order polynomial and Portfolio PCA. A more detailed description of the curve fitting techniques implemented in this exercise can be found in the Appendix.


FURTHER UP THE CURVE
Through curve fitting, supervisors and firms alike can incorporate confidential and ad-hoc scenarios into their analysis. We tested the effectiveness of curve fitting in modeling additional scenarios by using out-of-sample analysis, which consists of revaluing the fitted equation under scenarios other than the ones used to calibrate the parameters of the equation. The possibility of adding confidential scenarios is an important benefit, as it provides greater insight on demand compared with recent initiatives such as the European Banking Authority stress tests, which can only be re-run periodically and require significant modeling effort from each firm every time they are applied. Figure 1 shows that curve fitting approximates exposures well even under out-of-sample scenarios.

Our empirical results have shown that using curve fitting combined with CCR data at the proposed optimal level of granularity allows us to capture the dynamics of exposure and use the fitted equation for further forward-looking stress test analysis. We believe that enterprise risk managers and supervisors can benefit from curve fitting techniques to better understand CCR dynamics within individual firms, as well as across firms, without the need to disclose scenarios. The flexibility to incorporate confidential scenarios and assess possible outcomes without revealing the specific concern is of great value to risk managers and supervisors, who would otherwise have to worry that the very effect they are trying to prevent could be propagated by revealing it.

We are certainly at an early stage of using curve fitting techniques for CCR, and we expect more studies and more advanced curve fitting and statistical approaches to follow, increasing the accuracy of the assessment of CCR at an optimal level of granularity.

APPENDIX
Fitting to Values or Exposures?

Given that the firms provide values, positive and negative, the supervisors will then fit the polynomials directly to values and then compute the exposure using the following formula:

Exposure(t,s) = Max(Value(t,s), 0)

Fitting to values allows the assessment of CCR from two perspectives: the firm’s and the counterparty’s. Curve fitting to exposures directly becomes complicated due to the non-linear nature of the exposure function itself. For example, if we try to fit to positive exposures directly, there will be extreme scenarios that generate zero exposure, and there will be other, non-extreme scenarios that generate zero exposure as well. In these cases, the overall result could be a smoothing of the observations in which risk is under- or overestimated. As a result, we suggest that curve fitting is applied to values as opposed to credit exposures.

It is also important to remove cash settlements from the portfolio/netting set values. Cash settlements do not generate exposure and, since the dataset does not include information at the transaction level, it would be impossible to know the settlement dates of the individual transactions. If cash settlements are mistakenly taken into account, the curve fitting to values could still work correctly, but the exposure profile would be wrong, since it would keep growing monotonically as cash settles through time.

Figure 1. Out-of-sample vs. full valuation. [The chart compares exposure profiles from full valuation against first- and second-order Portfolio PCA fits using 33 principal components.]


FITTING TO NETTING SETS
Even though curve fitting allows us to fit CCR at a higher level than the transaction level, it cannot be applied at the highest level in the counterparty hierarchy. The reason is that different netting rules and restrictions may apply to different asset classes according to different ISDA Master Agreements, in a unilateral or bilateral way. The same applies to collateral: trading with some counterparties may require a CSA to be in place, while some sets with no netting agreement may not require collateral to be held or exchanged. Hence, curve fitting has to be applied at the netting set level, and exposures are then aggregated all the way up to the counterparty level.

Curve fitting sets where netting of exposure between transactions is allowed is quite straightforward: one just needs a single polynomial equation to approximate the values of the particular set and then compute the credit exposure. Curve fitting sets where netting is not allowed is a little more complicated, in the sense that two curve fitting equations are needed to model the dynamics of the complete set. It is necessary to split the scenarios that generate negative values from the scenarios that generate positive values; the first equation is then used to fit the positive values and the second to fit the negative values. The exposure of the set for a particular scenario and time step is the sum of the resulting exposures of the two equations.
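To make the netting distinction concrete, the sketch below shows how per-scenario exposure aggregates with and without netting; this is the quantity that the one- or two-equation fits described above are approximating. The trade values are hypothetical.

```python
import numpy as np

def no_netting_exposure(values: np.ndarray) -> np.ndarray:
    """Per-scenario exposure of a set where netting is NOT allowed:
    positive and negative trade values are treated separately, so exposure
    is the sum of positive values only.  values has shape (scenarios, trades)."""
    return np.where(values > 0.0, values, 0.0).sum(axis=1)

def netting_exposure(values: np.ndarray) -> np.ndarray:
    """Per-scenario exposure of a netting set: values net against each other
    first, and only a positive net value creates exposure."""
    return np.maximum(values.sum(axis=1), 0.0)

# hypothetical set of 3 trades across 4 scenarios
vals = np.array([[ 2.0, -1.5,  0.5],
                 [-0.5, -0.2,  0.3],
                 [ 1.0,  1.0, -3.0],
                 [ 0.4,  0.1,  0.2]])
print(no_netting_exposure(vals))  # [2.5 0.3 2.  0.7]
print(netting_exposure(vals))     # [1.  0.  0.  0.7]
```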

CURVE FITTING TECHNIQUES
Two different curve fitting techniques were considered during this exercise: linear curve fitting and nonparametric curve fitting. The former consists of performing a regression based on a linear combination of explanatory variables; the objective is to choose, or calibrate, the coefficients of the selected polynomial that minimize the weighted sum of squared residuals. The latter performs a non-parametric regression using a local polynomial estimator based on a normal kernel, with a bandwidth chosen from the mean absolute difference between observations.

In both approaches, three important steps are followed to ensure the soundness of the fit:

1. Goodness-of-fit. While minimizing the objective function to solve the curve fitting problem, goodness-of-fit measures are used to describe how well the equation fits the set of observations. Typical measures are R-squared and the standard error of the estimate. Statistical hypothesis tests such as the Kolmogorov-Smirnov and Chi-squared tests are also commonly used to test the normality of the residuals.

2. In-sample. In-sample analysis consists of valuing the fitted equation under the scenarios that were used to calibrate it. After being comfortable with the goodness-of-fit statistics, we proceed to analyze the in-sample results. If the in-sample fit is poor, it is our first clue of poor forecasting performance out-of-sample. In the context of CCR, good in-sample results mean that we could use the fitted equation to compute measures such as Expected Exposure (EE) and Potential Future Exposure (PFE).

3. Out-of-sample. Out-of-sample analysis consists of revaluing the fitted equation under scenarios (e.g. stress tests) other than the ones used to calibrate it. A good in-sample fit does not necessarily mean that the model will perform well out-of-sample. This is a very common issue with nonparametric approaches: for example, when using a normal kernel with global bandwidth calibration, the fitted local polynomial curve will have, by construction, a very good in-sample fit but very poor out-of-sample performance. A supervisor will be more interested in out-of-sample performance, since the supervisor wants to assess what the exposure would be under different stress test cases. Expected Exposure and PFE are measures that a firm’s IMM model should already be able to compute with reliable accuracy.

RISK REDUCTION
When the exposure to a counterparty is a function of a few risk factors, the curve fitting exercise is fairly easy. However, as the number of risk factors grows, the curve fitting problem becomes more complicated. Firstly, the dimensionality is high and it is difficult to construct functions of a large number of risk factors; this means having very long equations that are difficult to handle. Secondly, and more importantly, a multicollinearity problem usually arises. In mathematical terms, linear curve fitting reduces to solving a system of simultaneous linear equations, and solving it becomes complicated when high correlation between the risk factors is present.

A way to overcome these issues is by using risk reduction techniques such as Principal Component Analysis (PCA) or Portfolio PCA. PCA is applied to the variance-covariance matrix (VCV) of the risk factors’ log-returns; the idea is to first convert the correlated risk factors into abstract uncorrelated components and then reduce dimensionality by selecting the components that explain most of the variability. Curve fitting can then be performed only on the relevant principal components. Both PCA and Portfolio PCA use the same methodology to compute the principal components; the difference lies in the way they rank the components. In a normal PCA approach all the components have the same weight, while in a Portfolio PCA approach the components are selected based on the portfolio’s sensitivity to each factor, i.e. its contribution to portfolio risk. It is important to mention that the exposure to different factors varies across time as the portfolio matures, which is why we suggest that one of the requirements is for the firm to provide the supervisor with first-order approximations to risk sensitivities.
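A minimal sketch of one plausible way to implement the ranking difference described above. The weighting scheme (scaling each component’s variance by the squared first-order sensitivity along it) is an illustrative assumption, not necessarily the authors’ exact construction.

```python
import numpy as np

def ranked_components(vcv, sensitivities=None):
    """Eigen-decompose the risk factor VCV and rank components.

    Normal PCA ranks by explained risk factor variance (eigenvalue).
    'Portfolio PCA' instead ranks by each component's contribution to
    portfolio variance, using first-order sensitivities (deltas).
    """
    eigvals, eigvecs = np.linalg.eigh(vcv)           # ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]
    if sensitivities is None:
        scores = eigvals                             # normal PCA ranking
    else:
        loadings = eigvecs.T @ sensitivities         # delta along each component
        scores = (loadings ** 2) * eigvals           # variance contribution
    order = np.argsort(scores)[::-1]
    return eigvals[order], eigvecs[:, order], scores[order]

# hypothetical 4-factor VCV and netting-set deltas
vcv = np.diag([0.04, 0.02, 0.01, 0.001])
deltas = np.array([0.1, 0.0, 0.0, 50.0])             # big exposure to a 'quiet' factor
_, _, scores = ranked_components(vcv, deltas)
print(scores)  # the low-variance factor now ranks first
```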

Yaacov Mutnikas at the time of writing was the Head of Risk Architecture in the Risk Specialist Division, Financial Services Authority.

The views and opinions expressed by the authors are theirs alone, and do not necessarily reflect the views and opinions of any organization or professional affiliation.

FURTHER READING
TH!NK Magazine, November 2011: “Optimal Granularity: A Supervisory Deep Dive on Counterparty Credit Risk”.

“Principal Component Analysis in Quasi Monte Carlo Simulation”, Alexander Kreinin, Leonid Merkoulovitch, Dan Rosen and Michael Zerbs.

“Modelling Stochastic Counterparty Credit Exposure for Derivatives Portfolios”, Ben De Prisco and Dan Rosen.

“Understanding Financial Linkages: A Common Data Template for Global Systemically Important Banks”, Financial Stability Board.


THE SOCIAL MEDIA WORLD
(and what risk can learn from it)
By David Bester


By now you’ve likely watched or heard of “Kony 2012”. This short film is part of an online campaign to have Joseph Kony arrested and transferred to the International Criminal Court to face outstanding warrants for war crimes. Within six days, the video surpassed 100 million views: the fastest trip for a video to the 100 million milestone, eclipsing Susan Boyle’s famous appearance on Britain’s Got Talent by three full days.

In the context of social media campaigns and viral videos, “Kony 2012” is an unqualified smash. An item appeared out of nowhere and, through the ability of people to tweet, like, rebroadcast and embed the video on blogs and profiles, it quickly gained global attention. In finance, this type of unexpected event would more likely be viewed as a systemic threat.

With its seemingly effortless ability to transparently handle millions of pieces of data pulled from multiple platforms, social media seems well equipped to handle elements that financial risk management struggles to achieve. TH!NK connected with three thought leaders to discuss the impact and structure of social media, and to learn how their insights can be leveraged for the offline world.

LESSON #1: STAY LOOSE
Martin Thomas is a UK-based freelance marketing consultant and writer with an extensive background in advertising, PR, sponsorship and digital media. One of his particular interests is organizational culture and how social media influences it. This is a main theme in his latest book, Loose.

In the book’s early chapters, Martin cites an example of how organizations have moved away from empowering workers towards prescriptive behaviors. Nordstrom, the Seattle-based department store, developed a unique welcome package for new hires: for 25 years the entire employee manual was printed on a small card that contained one rule, “use good judgment in all situations”. This iconic message is still part of the welcome package, but is now accompanied by a thicker handbook of rules and regulations.

“Over 20 years, organizations have allowed this bureaucracy of tightness to develop, where people are asked to fill in ever more forms, and we’re harboring ever more compliance officers in order to feed this huge, tight machine that’s built up. And I started to wonder – how does this machine make business any better? What have the benefits been?” asks Martin.

Over the course of his research, Martin observed that ‘tight’ organizations – ones that are heavily bureaucratic and process-driven – are ill equipped to respond quickly to new events. This inflexibility is made even clearer when contrasted against social media’s ability to impact political, social, and commercial agendas without formal leadership. This influence is driving new behaviors and expectations that cannot be met by organizations operating with an illusion of certainty.

“The danger is complacency,” he explains. “Risk can be mitigated, but what you tend to find is that the organizations that deal with risk best have the ability to improvise. And the ability to improvise is not correlated with the ability to fill out forms.”

Discussions of corporate culture and empowerment have existed for decades. But today new technologies and complexities in the workplace are drivers forcing organizations to confront structural issues directly. Social media platforms in particular highlight how being tight or inflexible is incompatible with the expectations of digital delivery.

“Rules and guidelines, written with the very best of intentions, led one organization to take 10 days to approve and issue a 140-character tweet. It’s ridiculous when you think of the timescale, but this is a byproduct of a tight corporate structure going through its practices.”

In tight organizations, decision makers seeking to simplify operations through rule making may be setting themselves up to fail. In an enterprise structure, it is impossible to micromanage every last detail. In these situations judgment and trusting your team can produce far more successful outcomes, particularly when the abilities to improvise and operate in close to real time are highly valued.

“Managers in tight organizations have grown terrified of judgment. Yet, having spent a lot of time looking at organizations that seem to be thriving in our world, the ones leading that charge are not characterized by the largest budget or the best technology, but rather by a loose mindset.”

In Loose, Martin argues that the future of business is letting go. It may be tougher than being in a tight organization, but a loose structure is better equipped to support collaboration, ambiguity, and living with uncertainty.

“If you don’t trust your back office, and if you’re constantly layering on improvements, it tightens everything up and slows you down. Building systems where you have kept the right people, trained them properly, trust them to make decisions and don’t micromanage them to the nth degree is critical. It prepares people for success by admitting an important truth: the 21st century is not a place for tidy minds.”

LESSON #2: CAPTURE EVERYTHING NATIVELY
Early corporate adopters of online platforms found themselves without a standardized compliance framework. One of the specific challenges with social media is that unstructured content like tweets and videos may not have the space or medium considerations to support regulatory and legal disclosures.

Tom Chernaik, an expert in social media, marketing, and law, set out to solve this problem. Tom is the CEO and founder of CMP.LY, a company dedicated to simplifying the compliance and disclosure process across digital media channels.

“It doesn’t matter whether you only have 140 characters, or unlimited bandwidth for a post or a video clip. At an enterprise level there is an expectation that what you post is in compliance with certain guide-lines,” says Tom.

Within their first day, CMP.LY came up with a very basic coding framework. Using a set of easily identifiable icons and URLs, CMP.LY provides a standardized way to communicate disclosures across platforms like LinkedIn, Facebook and blogs. When companies want to live tweet investor calls, they can use CMP.LY to indicate safe harbor statements along with any kind of forward-looking disclosures. A unique trackable code is used to link any information the company shares with a safe harbor statement or equivalent that includes all the relevant statements.

In 2007, a year after Twitter launched, FINRA (Financial Industry Regulatory Authority) issued guidance requiring brokers to retain all forms of electronic communication. Initially companies responded by taking screen shots of a tweet or blog post, a practice Tom sees still in use, and one he is actively trying to help companies overcome.

“What people don’t realize is that a screenshot of a social media engagement fails to capture any of the relevant information or metadata kept around the original post. And then, this screen shot is filed away somewhere and it becomes a horrific, manual process later on not only to locate the image, but to try and reverse engineer where it came from, who authored it, and what related posts it should be connected to for context. With the volume of online data and the timescales demanded by real time interaction, this isn’t practical. There are much more elegant solutions available.”

Social media conversations, like collateral and trade agreements, contain high volumes of detailed information. The more a different language or platform is used to document an item’s metadata, the harder it is to recreate all the relevant threads. Capturing all relevant information natively, ideally during the original posting or inputting phase, can dramatically enhance the audit trail required for tracking and reporting purposes.


As social media and digital compliance evolve, agencies like FINRA, the SEC and the Office of Fair Trading in the UK will develop best practices and guidelines about what is and isn’t acceptable in the digital media sphere. Tom is confident this will occur, but doesn’t think a lack of prescriptive clarity should prevent organizations from moving into the social world.

“Let’s admit that, for the foreseeable future, this is a space that is not going to be extremely well defined. Coming up with your own definitions of reasonable monitoring and taking ownership over your programs is the best way to manage the risks in this space, yet still take advantage of the enormous potential.”

LESSON #3: FOCUS ON THE HAYSTACK
Crimson Hexagon is a company that uses statistical theory to analyze unstructured text. The company’s technology was developed by Gary King, its co-founder, Chief Scientist, and a Professor at Harvard University.

“The way that human beings understand large amounts of documents, either social media posts or financial transactions, is that we put them into categories. Then we try to figure out how many are in each category,” Gary says.

When you enter a search on Google, their engine takes all the world’s websites and classifies them into two categories based on your query: the ones they are going to show you, and everything else. Gary estimates that the initial return provides something you’re interested in on the first page about 60% of the time, and for this type of search it has proven sufficient. Unfortunately this method doesn’t translate well into searches for the unstructured data of social media.

“Imagine you’re searching for opinions on a new consumer product feature, and you estimate that 10% of posts will reflect a negative response. But our assumption of 60% accuracy also means we assume 40% inaccuracy. Instead of the 10% estimate you started with, the truth could be 50%. You soon realize you are gathering useless information.”

Gary and his team used every computer science method they could get their hands on to try and stack posts into different categories. They determined that available classifiers didn’t work at all, and that they didn’t really want to focus on classifying any individual post. What Gary and his team created is a method to estimate the percent in a category without requiring individual classification. By changing the approach, they have radically changed the outcomes as well.

Crimson Hexagon can tell you the fraction of opinions about a company, or whatever the set of categories selected, in social media posts. “We can’t classify responses, but if you’re interested in understanding what conversations are taking place online, we can give better estimates than anybody. We don’t care about the needle in the haystack. We care about the haystack.”
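As a toy illustration of the general idea described here (estimating category proportions directly from aggregate word patterns rather than classifying each post), and emphatically not Crimson Hexagon’s actual algorithm, the sketch below solves for the proportions that best explain an unlabeled corpus’s average word profile as a mix of per-category profiles learned from a small hand-coded sample. All data are simulated.

```python
import numpy as np

def estimate_proportions(labeled_features, labels, unlabeled_features):
    """Estimate category proportions without classifying individual posts.

    labeled_features   : (n_labeled, n_words) 0/1 word-presence matrix
    labels             : (n_labeled,) category index per labeled post
    unlabeled_features : (n_unlabeled, n_words) matrix for the target corpus
    """
    categories = np.unique(labels)
    # mean word profile per category from the hand-coded sample
    profiles = np.stack([labeled_features[labels == c].mean(axis=0)
                         for c in categories], axis=1)   # (n_words, n_cats)
    target = unlabeled_features.mean(axis=0)             # corpus-wide profile
    # solve profiles @ p ~= target, then project onto the probability simplex
    p, *_ = np.linalg.lstsq(profiles, target, rcond=None)
    p = np.clip(p, 0, None)
    return p / p.sum()

# hypothetical tiny example with 2 categories and 4 word stems
rng = np.random.default_rng(1)
pos_profile = np.array([0.8, 0.6, 0.1, 0.1])
neg_profile = np.array([0.1, 0.2, 0.7, 0.9])
lab = rng.random((200, 4)) < np.vstack([pos_profile] * 100 + [neg_profile] * 100)
y = np.array([0] * 100 + [1] * 100)
unl = rng.random((1000, 4)) < np.vstack([pos_profile] * 300 + [neg_profile] * 700)
print(estimate_proportions(lab.astype(float), y, unl.astype(float)))  # roughly [0.3, 0.7]
```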

In an attempt to enhance their analytics, many companies have tried to create code that anticipates each potential variable. Gary refers to this approach as the world’s largest ‘if’ statement. For social media, this may include classifying a statement as positive unless it ends with ‘not’ or a smiley face, indicating sarcasm. “You can imagine writing that set of instructions, and that it can work. But it will be tuned so carefully and specifically to one application that by the time you’re done it will be many years into the future and it won’t apply to any other data set. It doesn’t transfer basically.”

Gary cites an example of trying to anticipate words that would describe a positive movie review. You may consider spectacular, sensational, great or best as obvious choices. As it turns out, one of the words that best predicts a positive movie review is ‘still’.


“There’s no logical reason to predict that: it just turns out movie reviewers like to write things like, ‘the direction was lacking and the script was horrible, but still...’” Instead of building ‘if’ statements, Crimson Hexagon’s technology assembles information from unstructured text, sometimes audio and video, and many types of proprietary information streams into categories, so that humans can focus on what we do best, which is reading and understanding.

“The qualitative information that people pore over, such as analyst reports and company reports, and any kind of conversations taking place within organizations, are incredibly valuable. It turns out we can understand and analyze these sources in a systematic way now.”

As an industry, financial services have been slow to embrace social media. While more firms are developing a digital presence each year, compliance concerns, a constantly changing landscape, and uncertainty about ROI remain barriers to entry. This doesn’t prevent us, however, from noticing some similarities between the technological evolution of both industries.

In both finance and digital media, new technologies enabled global networks to develop. An array of different platforms emerged, some new and some built upon legacy systems. The evolution of these data streams presents complications for measurement, compliance, and acquiring actionable insights.

From the earliest days of their brief history, online platforms have enabled users to communicate with each other in or near real time, and to share detailed information in a variety of formats. There is a growing expectation for this type of access within financial services and society in general. Staying loose, capturing information natively, and focusing on the big picture are three areas organizations can explore to reconstruct successful principles from the social world.


S tochastic & scholastic

Assets, liabilities, and the interconnectivity of risk

By Andy Aziz

44

Page 47: Algo think0612-june12

S tochastic & scholastic

Benchmarks are a fact of life. We measure everything in our lives against expectations and past experience. For asset managers, the traditional benchmark has been external: the idea was that if you invested in equities, the more you exceeded external market performance, the more likely you were to beat liabilities.

An evolving approach is to have mandates linked more explicitly to the liabilities of the institutional investor. Liability-driven investment concepts aren’t new, but only recently have technological advances made it feasible to capture the interconnected risks of assets and liabilities.


Mike Earley is an insurance investment strategist at Deutsche Insurance Asset Management, one of the world’s leading third-party insurance asset managers. In his view, interconnectivity already exists; asset managers are now simply in a position to do something about it.

“Today’s insurance products are more varied and complex than traditional term life policies. Consider a variable annuity, where an insurer may make a guarantee about the value of that annuity. Liabilities become tied to capital markets, and interconnectivities just shoot through the roof.”

TH!NK spoke with Mike about interconnectivity, the nature of liability-driven mandates, and how a common framework can help connect the worldviews of insurance and investment.

FORCES OF CHANGE
Computational power has provided a foundation for new thinking in asset liability management. Stochastic analysis, which enables thousands of scenarios to be calculated on a regular basis, is a particularly valuable tool. “The availability of stochastically driven approaches is an exciting development, because it allows for academic techniques that have existed for a while to be put into practice. Specifically, they allow us to do large scale simulations involving both assets and liabilities.”

Large scale simulations, such as those using replicating portfolios, are used to identify risk factors common to assets and liabilities within a consistent framework. In a replicating portfolio, capital market instruments serve as proxies for liability exposures, so mature capital market simulation software can then be applied to analyze the liability side of the balance sheet.
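To make the mechanics concrete, here is a minimal, hypothetical sketch of a replicating-portfolio fit, not a description of any firm’s production methodology: liability values under a set of scenarios are regressed against the values of candidate capital market instruments, and the fitted weights define a portfolio that can stand in for the liability in market simulations. The scenario data, instrument set and weights below are invented purely for illustration.

```python
import numpy as np

# Hypothetical illustration of a replicating-portfolio fit.
# Rows = economic scenarios, columns = candidate capital market
# instruments (e.g., zero-coupon bonds of several maturities and an
# equity index put). Values are present values under each scenario.
rng = np.random.default_rng(0)
n_scenarios, n_instruments = 1000, 4
instrument_values = rng.normal(100.0, 10.0, size=(n_scenarios, n_instruments))

# Liability values per scenario -- here faked as a noisy combination of
# the instruments purely so the example runs; in practice these come
# from the actuarial projection model.
true_weights = np.array([0.5, 1.2, 0.0, 0.8])
liability_values = instrument_values @ true_weights + rng.normal(0.0, 1.0, n_scenarios)

# Least-squares fit: choose instrument weights that best reproduce the
# liability value across all scenarios.
weights, residuals, *_ = np.linalg.lstsq(instrument_values, liability_values, rcond=None)

print("Fitted replicating weights:", np.round(weights, 3))
# The fitted portfolio can now be revalued with standard capital market
# simulation software as a proxy for the liability.
```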

These advanced calculations can help organizations better identify shared risks. Their use is also encouraged by outside groups that stand to gain from more precise statistical measures.

“In the accounting world, there is a growing interest in describing financial results more stochastically, and with statistically driven approaches. I’d say this is in its initial phases. You can see a similar interest developing among financial industry regulators and rating agencies. Both groups expect companies to have an enterprise level view of risk, and stand to benefit when those organizations provide the most accurate picture of risk they can.”

TIME AND DURATION

Financial markets, which are statistically driven, can’t be easily integrated with models built to anticipate policyholder behavior. The latter are dominated by individuals whose motivations are far more complex and far harder to define. As a result, different tools have developed for each side of the business.

“Traditionally, the way that actuaries think about the world and the way that financial engineers think about the world have developed along parallel paths,” observes Mike. “The challenge for insurance companies is that the finance and actuarial teams don’t necessarily use the same language. Some of the significant differences in these two worlds can be addressed through new tools and approaches.”

One such tool is duration, which has long been the default approach for asset liability matching. Duration is best suited to closed-form equations, is relatively simple to calculate, and for certain business lines it remains applicable. Key rate duration (or partial duration) is a more in-depth enhancement of average duration that identifies the impact of changes across the yield curve, rather than just movements in the average rate. To measure the interest rate sensitivity of a portfolio or security, key rate duration holds all other variables constant while shifting the market rate at each specific maturity point on the yield curve in turn.
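As a rough sketch of the mechanics just described, using an invented zero curve and bond rather than any real portfolio, the example below bumps one maturity point at a time, reprices a simple fixed-coupon bond, and reports the resulting key rate durations.

```python
import numpy as np

def bond_price(zero_rates, maturities, coupon=0.04, face=100.0):
    """Price a simple annual-pay bond off a zero curve (continuous discounting)."""
    cash_flows = np.full(len(maturities), coupon * face)
    cash_flows[-1] += face  # principal repaid at final maturity
    discount = np.exp(-zero_rates * maturities)
    return float(np.sum(cash_flows * discount))

# Hypothetical zero curve at annual maturity points (assumed values).
maturities = np.array([1.0, 2.0, 3.0, 5.0, 7.0, 10.0])
zero_rates = np.array([0.020, 0.022, 0.025, 0.028, 0.030, 0.032])

base = bond_price(zero_rates, maturities)
bump = 0.0001  # 1 basis point

# Key rate duration: bump one maturity point at a time, holding the
# rest of the curve constant, and measure the price sensitivity.
for i, t in enumerate(maturities):
    bumped = zero_rates.copy()
    bumped[i] += bump
    krd = -(bond_price(bumped, maturities) - base) / (base * bump)
    print(f"{t:>5.1f}y key rate duration: {krd:6.3f}")
```

Summing the individual key rate durations recovers something close to the bond’s overall effective duration, which is one way to sanity-check the bumps.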

The challenge emerges when companies try to address product designs where policyholders can make choices and exercise options. For these products, insurers apply replicating portfolio technologies, which encapsulate more complex models that attempt to value this policyholder optionality. “Black-Scholes is probably the most famous, binomial pricing models being some others. While those equations do a good job of predicting movements in capital markets, they aren’t easily translated into the insurance marketplace. They require significant processing power to calculate results under large scale simulations, and rely on assumptions about option holder information and motivation.” When the processing challenges can be overcome, however, replicating portfolios are a particularly valuable approach when optionality, an inherent component of many insurance products, enters the picture.
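By way of illustration only, a hypothetical maturity guarantee on a variable annuity account can be sketched as a European put and valued with the textbook Black-Scholes formula. The account value, guarantee level, volatility and rate below are assumed numbers, and the sketch deliberately ignores the lapse, fee and mortality features that make real products far harder to value; it is not presented as the approach any firm uses in practice.

```python
from math import log, sqrt, exp
from statistics import NormalDist

def bs_put(spot, strike, rate, vol, maturity):
    """Black-Scholes value of a European put (no dividends)."""
    d1 = (log(spot / strike) + (rate + 0.5 * vol**2) * maturity) / (vol * sqrt(maturity))
    d2 = d1 - vol * sqrt(maturity)
    N = NormalDist().cdf
    return strike * exp(-rate * maturity) * N(-d2) - spot * N(-d1)

# Hypothetical variable-annuity guarantee: the insurer promises the
# account will be worth at least 100 in 10 years, so the guarantee
# payoff resembles a European put on the account value.
account_value = 100.0   # current account value (assumed)
guarantee = 100.0       # guaranteed maturity value (assumed)
risk_free = 0.03        # assumed risk-free rate
volatility = 0.20       # assumed account volatility
years = 10.0

print("Approximate guarantee cost:",
      round(bs_put(account_value, guarantee, risk_free, volatility, years), 2))
# Real products add policyholder lapse and withdrawal behavior, fees and
# mortality, which is why large scale simulation is needed in practice.
```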

OPTIONS AND OPPORTUNITIES

The evolution of insurance, particularly life insurance, has led to a focus on retirement planning products. Variable annuities carry a significant investment component with strong ties to both equity and fixed income markets. Additionally, the policyholder has the ability to exercise a number of options embedded in the product. The consequence of this optionality is a mushrooming set of policyholder decisions that need to be anticipated and addressed.

“Recent swings in equity markets illustrate how many of these options turned from considerations to problems. And as soon as they became problems, something had to be done to address them. This is how optionality has become an issue that has attracted so much attention and is one of the driving factors behind the need for improvements in asset liability management.”

Policy options tend to become problems when large movements take place in capital markets. And closed form calculations, which are best suited to narrow movements, lose their effectiveness when it comes to capturing those large moves.

“You don’t need large scale simulations to examine what happens if the S&P 500 goes up or down 1%. But you can’t use small scale or closed form simulations to examine what happens when markets move 10%, 20% or 30%, since options may be triggered, leading to non-linear, discontinuous results. The better large scale simulations perform relative to closed form calculations, the more they are needed – and the more critical they become to measure and understand potential outcomes.”
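The point about non-linearity can be illustrated with a deliberately simplified, hypothetical surplus position: an equity holding net of a written guarantee that only pays off below a threshold. A linear, closed-form style sensitivity estimated from a 1% move tracks small shocks well but misses the extra loss once the guarantee is triggered. All numbers are assumptions.

```python
# Hypothetical surplus of an insurer holding an equity portfolio and
# having written a maturity guarantee that pays max(strike - index, 0).
def surplus(index_level, holding=100.0, strike=85.0, base_index=100.0):
    equity = holding * index_level / base_index
    guarantee_payoff = max(strike - index_level, 0.0)
    return equity - guarantee_payoff

base = surplus(100.0)
# Local, linear sensitivity estimated from a small (1%) move, in the
# spirit of a closed-form shortcut.
delta = (surplus(101.0) - surplus(99.0)) / 2.0

for shock in (-0.01, -0.10, -0.20, -0.30):
    level = 100.0 * (1 + shock)
    exact = surplus(level) - base          # full revaluation
    linear = delta * 100.0 * shock         # linear extrapolation
    print(f"{shock:+.0%} move: exact {exact:8.2f}  linear estimate {linear:8.2f}")
```

For the 1% and 10% moves the two agree; at 20% and 30% the guarantee is in the money and the linear estimate badly understates the loss, which is exactly where full simulation earns its keep.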

FACTORS AND APPROACHES

On the investment side, the study of capital markets has the benefit of a rich history of pricing. Historical pricing data can be used to identify relationships between prices and economic activity, and is useful for statistically testing theories. On the insurance side, a much smaller pool of historical data exists, and actuarial science has evolved specific techniques to tease out policyholder behavior from it.

A Bond Simulation Approach and a Factor Model Approach are two methods that can be leveraged to connect both sides of the organization.

The way to think about a Bond Simulation Approach is to think about anticipating cash flows for assets and liabilities. In order to do this, you have to be able to model cash flows for both. For assets (generally bonds), cash flow is a result of coupon payments and maturities and, in the case of structured securities like mortgage-backed securities, prepayments and paydowns. On the liability side, the approach works best when cash flows are driven primarily by non-capital market events (mortality and morbidity), plus whatever policyholder surrenders might occur if rates change.

This type of modeling is most useful when applied to contractual exchanges of cash flows, which includes most life products and fixed income investments. For life companies, this technique covers many of their core analysis needs. For products that fall outside these categories, and for property casualty companies for example, the same tie-in doesn’t exist. In these cases, a Factor Model Approach is more applicable.
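A minimal sketch of the cash flow projection behind a Bond Simulation Approach might look like the following, with an invented bond portfolio and an invented, flat liability payout schedule standing in for the actuarial projection.

```python
from collections import defaultdict

# Hypothetical bond simulation sketch: project asset cash flows
# (coupons and maturities) against expected liability cash flows
# (e.g., mortality-driven benefit payments) year by year.
bonds = [  # (face, annual coupon rate, years to maturity) -- assumed holdings
    (1_000_000, 0.040, 3),
    (2_000_000, 0.035, 7),
    (1_500_000, 0.050, 10),
]
# Expected liability payouts per year (illustrative placeholder figures).
liability_cash_flows = {year: 300_000 for year in range(1, 11)}

asset_cash_flows = defaultdict(float)
for face, coupon, maturity in bonds:
    for year in range(1, maturity + 1):
        asset_cash_flows[year] += face * coupon   # coupon income
    asset_cash_flows[maturity] += face            # principal at maturity

print("Year      Assets  Liabilities           Net")
for year in range(1, 11):
    a = asset_cash_flows[year]
    l = liability_cash_flows.get(year, 0.0)
    print(f"{year:>4}  {a:>10,.0f}  {l:>11,.0f}  {a - l:>12,.0f}")
```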

“In the Factor Model Approach, you look at the world and identify what the risks are to your business. These risks might be changes in GDP, employment levels, interest rates, inflation, and so on. Changes in asset and liability values caused by variations in these factors are analyzed, so that you can create a mathematical model of assets and liabilities. This model can be used to equate changes in those risk factors with changes in asset and liability market values (the net of the two being surplus). You end up with a common framework and a portfolio of models to estimate changes in values of both assets and liabilities simultaneously.”
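A stripped-down, hypothetical version of that framework could look like the sketch below: assumed linear sensitivities of asset and liability values to a shared factor set are combined with one illustrative factor shock to estimate the change in surplus. The sensitivities and shocks are placeholders, not calibrated values from any real model.

```python
import numpy as np

# Shared factor set for both sides of the balance sheet.
factors = ["GDP growth", "interest rates", "inflation", "equity level"]

# Assumed linear sensitivities (value change per unit factor move), e.g.
# fitted by regressing historical value changes on factor changes.
asset_sens = np.array([2.0, -8.0, -1.0, 5.0])
liability_sens = np.array([0.5, -12.0, 3.0, 4.0])

# One illustrative scenario of factor shocks.
shock = np.array([-0.01, 0.005, 0.02, -0.10])

d_assets = asset_sens @ shock
d_liabilities = liability_sens @ shock

for name, a, l in zip(factors, asset_sens, liability_sens):
    print(f"{name:>15}: asset sens {a:+6.1f}  liability sens {l:+6.1f}")
print(f"Change in assets {d_assets:+.3f}, liabilities {d_liabilities:+.3f}, "
      f"surplus {d_assets - d_liabilities:+.3f} (illustrative units)")
```

The appeal of the approach is that surplus, the net of the two sides, falls out of one consistent set of factor exposures rather than two disconnected models.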



REACHING FOR THE WRENCH

Both bond simulation and factor model approaches can be used as underlying models to represent assets and liabilities through large scale simulation. However, they also depend on cross-communication and cross-education to ensure teams on the investment side are talking to teams on the insurance side.

Developing a shared vision is a crucial component of any risk framework. The more advanced you want the coordination between assets and liabilities to be, the more work needs to be done to ensure that a common language exists, and that the approaches are being used in a consistent manner. One of the barriers to developing this common language is that even the phrase ‘liability driven investing’ means different things to different people. “When I speak with people who work in more traditional areas of institutional asset management, with endowments and pension plans and trusts, their idea of what LDI means turns out to be completely different from that of life insurers.”

For a traditional institutional investor who might be worried about pensions, the main challenge is to achieve high rates of return to meet long-term, relatively consistent cash payment obligations. Actuaries focus on what the employment population looks like, what their salaries are going to be, and what the benefit is going to be for defined benefit plans. Their investment objective is to make sure these cash flows will be met.

“Within this group, employees don’t have a large incentive or ability to take their money and run. This reduces optionality. Another assumption is that employee salaries (and the resulting pension benefit) will be somewhat tied to inflation. If this group embraces LDI, they will go in a completely different direction than life insurers, for whose liabilities the inflation component is very small.”

Mike suggests we can take something away from these differences. “It tells us that there are common techniques that can be applied to different fields, but when you look at them individually they aren’t identical. The factors and approaches we have at our disposal to measure and manage interconnectivity are broad enough to be used by all groups, but it doesn’t mean they would use them the same way, or that all factors are of equal import to all groups. I’d say the toolbox can be the same, but pension funds will grab for the screwdriver, while insurance companies might want to use the wrench first.”

FROM MEASUREMENT TO MANAGEMENT

Organizations could stop at using new technologies and techniques to more effectively measure risk. But this is only part of the equation. “Once you understand where your risks lie, the question then becomes, ‘well, what do you want to do about it?’ That’s when you start to incorporate the management side into things, and where the potential benefits get quite exciting.”

Management is all about evaluating choices. To evaluate each new option, the entire measurement process has to be repeated. In a complex, interconnected world, instead of measuring one or two scenarios for each option and choosing between them, 10, 100 or 1,000 iterations of each option may need to be explored. The ability to run large scale, complex analyses in an efficient, scalable and repeatable process is the only way a risk-based framework can inform management decisions in real time.
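As an illustration of that iteration loop, and with invented return and liability distributions rather than a real asset liability model, the sketch below pushes a handful of candidate allocations through the same large scenario set and summarizes each by expected surplus change and a tail percentile, so competing options can be compared on a consistent footing.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical candidate allocations to be evaluated side by side.
strategies = {
    "conservative": {"equity": 0.2, "bonds": 0.8},
    "balanced":     {"equity": 0.5, "bonds": 0.5},
    "aggressive":   {"equity": 0.8, "bonds": 0.2},
}

# One large scenario set shared by every strategy (assumed distributions).
n_scenarios = 10_000
equity_returns = rng.normal(0.07, 0.18, n_scenarios)
bond_returns = rng.normal(0.03, 0.05, n_scenarios)
liability_growth = rng.normal(0.04, 0.03, n_scenarios)

for name, w in strategies.items():
    asset_growth = w["equity"] * equity_returns + w["bonds"] * bond_returns
    surplus_change = asset_growth - liability_growth   # per unit of assets
    mean = surplus_change.mean()
    tail = np.percentile(surplus_change, 5)             # 5th percentile outcome
    print(f"{name:>12}: expected surplus change {mean:+.3f}, 5th percentile {tail:+.3f}")
```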

“As asset managers, the way we look at management is that options exist in investing, and tools are required to help choose between these options. A risk framework is great for measuring the current level of risk being taken and return expected by a company and their investment portfolio. But it can also be used to turn a corner and form the basis for evaluating competing investment (and operational/product) strategies and opportunities. Using these models to go beyond measuring and incorporating them into planning and strategy evaluation is the part we’re pretty excited about, and the area we think is going to be the next area of development. These are the opportunities for a true enterprise risk framework. The option has always been there, but now that the processing technology is maturing, many companies have teams that are looking to bring this unifying framework into existence.”

The same advances that have helped companies do a better job of measuring and anticipating risk can be used within organizations to manage their business. This doesn’t necessarily change the focus of what asset managers do, or clear a path to new services that aren’t being offered today. But it does allow investment teams to do a better job of tailoring what they do for specific risk profiles, specific clients, and specific organizations.

BEYOND THE BLACK BOX

There are shared risks across the balance sheet. As the interconnectivity between products and capital markets has increased, particularly in the life insurance space, additional computational power has been needed to support the measurement and management of that risk. As this cycle repeats, new ideas that previously existed only in an academic space have been brought into practice by technological advances.

“I picture our current state as a piece of metal sitting on an anvil, and people are banging away at it with hammers. We’re going to see more forging of these concepts and more refinement of these ideas.”

Today, benchmarking is most closely associated with hardware and software development. But the term dates back to the shoemaking industry of the 19th century: cobblers would place the client’s foot on a ‘bench’ and measure out the appropriate ‘mark’ on which to devise the shoe.

Such a singular approach is ineffective in finance, where different approaches are required to match the most appropriate techniques to different objectives. “The black box approach can’t really be effective with the nature of insurance liabilities. Our work requires ongoing judgment and discretion. Asset liability tools will be just that: one of many tools that companies use to understand their risk exposure and, in turn, use to make good decisions.”


Player pieces inspired by heroes of finance. Latest addition: Demi Moore’s CRO from the film Margin Call.

Players can use up to four currencies to take advantage of fluctuations in FX markets.

Land on the square, claim your annual bonus! Choose between:
a) mid-sized Monet
b) Mercedes-Benz SL65 AMG
c) case of Romanée Conti wine

Forgot anniversary. Go directly to jail.

Your latest hire hacks back office systems – spend a week with regulators.

Client’s favorite lunch spot fails health inspection. Lose a turn.

Collateral called in by banker.
