Portfolio Value at Risk
A Research Report on
“Portfolio Value at Risk”
A Dissertation Submitted in partial fulfillment Of the requirements for the award of
M.B.A Degree of Bangalore University
Submitted By GUNJAN SHIKHA
Reg. No: 07XQCM6032
Under the Guidance of Prof. Praveen Bhagawan
(Internal Guide)
M.P.BIRLA INSTITUTE OF MANAGEMENT (Associate Bharathiya Vidya Bhavan)
#43, Race Course Road, BENGALURU-560001 2007-2009
DECLARATION
I hereby declare that the research work embodied in this dissertation entitled “Portfolio Value at Risk”, an analytical study carried out by me under the guidance and supervision of Prof. Praveen Bhagawan, M.P.Birla Institute of Management, Bangalore (Internal Guide), in partial fulfillment of the requirements of the Master of Business Administration degree, is my original work.
I also declare that this dissertation has not been submitted to any
University/Institution for the award of any Degree/Diploma, fellowship or other
similar title or prizes.
Place : Bengaluru Gunjan Shikha
Date : / / 2009 Reg.No.07XQCM6032
GUIDE’S CERTIFICATE
I hereby certify that the research work embodied in this dissertation entitled “Portfolio Value at Risk” has been undertaken and completed by GUNJAN SHIKHA, bearing Registration No. 07XQCM6032, and is a bonafide work carried out under my guidance during the academic year 2007-09 in partial fulfillment of the requirements for the award of the MBA degree of Bangalore University.
I also certify that she has fulfilled all the requirements under the covenant governing the submission of dissertations to Bangalore University for the award of the MBA degree.
Place: Bengaluru Prof. Praveen Bhagawan
Date : / / 2009 Faculty, MPBIM
PRINCIPAL’S CERTIFICATE
I hereby certify that this dissertation is an offshoot of the research work
undertaken and completed by Miss Gunjan Shikha under the guidance of
Prof. Praveen Bhagawan, MPBIM, Bangalore (Internal Guide).
I also certify that this dissertation has not been submitted to any University/Institution for the award of any Degree/Diploma, fellowship or other similar title or prizes.
Place: Bangalore Dr. Nagesh S Malavalli
Date : / / 2009 Principal, MPBIM
ACKNOWLEDGEMENT
It is my great pleasure to take this opportunity to thank all those who helped me, directly or indirectly, in the preparation of this research report. I am happy to express my deep sense of gratitude to my internal guide, Prof. Praveen Bhagawan, for his enormous guidance and assistance. He has been my mentor and guide; his continuous encouragement and valuable suggestions helped me at every stage of this project.
I would also like to express my thanks to Dr. Nagesh S Malavalli, Principal, M.P Birla Institute of Management, Bangalore, and I am also thankful to the entire teaching faculty for their valuable guidance in preparing this research report.
A special thanks to my friends and family for their encouragement and help in completing the study.
Finally, I pray to the almighty to bestow upon me success and progress in my endeavors.
Gunjan Shikha
Reg. No. 07XQCM6032
CONTENTS

1. Theoretical Background
1.1 Background
1.2 Value at Risk (VaR)
1.3 Mechanics of VaR Estimation
1.4 Steps in Constructing VaR
1.5 VaR and Confidence Levels
1.6 Identifying the Important Market Factors
1.7 VaR Methods
(i) Analytic Method
(ii) Historical Simulation Method
(iii) Monte Carlo Simulation Method
1.8 Review of Literature
2. Research Design
2.1 Statement of Problem
2.2 Research Objectives
2.3 Hypothesis
2.4 Scope of Study
2.5 Research Methodology
2.6 Limitations
2.7 Chapter Scheme
3. Industry Profile
3.1 Equity
3.2 Bonds
3.3 Currencies
4. Analysis and Interpretation
4.1 ADF Tests for Equities, Bonds and Currencies
4.2 Historical Simulation
4.3 Analytic Approach
4.4 Single Asset Case
4.5 Two Asset Case
4.6 Monte Carlo Simulation
4.7 Value at Risk for Portfolio
5. Findings and Conclusion
5.1 Findings
5.2 Conclusion
5.3 Recommendations
Selected Bibliography
Annexure
List of Charts and Tables

Chart 1: Historical Simulation
Table 1: Equity ADF Test
Table 2: Bonds ADF Test
Table 3: Currency ADF Test
Table 4: Data for Calculation of VaR through Historical Simulation
Table 5: Calculation of Daily Stock Volatility
Table 6: Simulated Index
Table 7
Table 8
EXECUTIVE SUMMARY
What is the most I can lose on this investment? This is a question that
almost every investor who has invested or is considering investing in a risky
asset asks at some point in time. Financial institutions, corporate treasuries and individual investors all need a way to measure their risk, and they need that measure to be scientifically rigorous. Optimal allocation of a given capital among competing assets is a standard problem that any fund manager faces. Given, say, an IBM and a CISCO stock, a fund manager would have to decide how much money to allocate to each. The
decision would depend on the risk appetite of the person whose money is being
managed by the fund. We are considering Value at Risk, popularly known as
VaR, as a measure of risk. Value at Risk tries to provide an answer, at least
within a reasonable bound. In fact, it is misleading to consider Value at Risk, or
VaR as it is widely known, to be an alternative to risk adjusted value and
probabilistic approaches. After all, it borrows liberally from both. However, the
wide use of VaR as a tool for risk assessment, especially in financial service
firms, and the extensive literature that has developed around it, push us to
dedicate this report to its examination.
We begin with a general description of VaR and the view of risk that
underlies its measurement, and examine the history of its development and
applications. We then consider the various estimation issues and questions that have come up in the context of measuring VaR, and how analysts and researchers have tried to deal with them; this is covered in the Review of Literature. Next, we set out the research design, elaborating on the objectives, the data and its sources, and the chapter scheme of the research report. The industry profile then gives a brief account of the equities, bonds and currencies selected for the study. Next, we calculate VaR using the standard VaR models, i.e. the Analytic Method, Historical Simulation and Monte Carlo Simulation, and then construct a portfolio and calculate its VaR. In the final section, we present the research findings, conclusion and recommendations of the report.
CHAPTER 1
THEORETICAL BACKGROUND
1.1 Background
The concept and use of VaR is recent. Value-at-Risk was first used by
major financial firms in the late 1980s to measure the risks of their trading
portfolios. Since then, the use of Value-at-Risk has exploded. While
the term “Value at Risk” was not widely used prior to the mid 1990s, the origins
of the measure lie further back in time. The mathematics that underlies VaR
were largely developed in the context of portfolio theory by Harry Markowitz
and others, though their efforts were directed towards a different end – devising
optimal portfolios for equity investors. In particular, the focus on market risks
and the effects of the co-movements in these risks are central to how VaR is
computed.
Value-at-Risk (VaR) has become one of the most popular risk measures
since it was recommended and adopted by the Bank for International Settlements and USA regulatory agencies in 1988. The straightforward interpretation of
VaR makes this risk measure an intuitive criterion for asset management
decisions. The VaR concept has also been extended to the portfolio Value-at-
Risk (PVaR) measure used for managing risks and returns under a multiple-
asset portfolio. Although VaR and PVaR are widely used in practice, recent
criticisms have focused on the financial risks firms face if the VaR or PVaR
estimates are based on poor information. One potentially important source of
estimation error is in the assumptions regarding the probability model of asset
returns.
The impetus for the use of VaR measures, though, came from the crises
that beset financial service firms over time and the regulatory responses to these
crises. The first regulatory capital requirements for banks were enacted in the
aftermath of the Great Depression and the bank failures of the era, when the
Securities Exchange Act established the Securities Exchange Commission
(SEC) and required banks to keep their borrowings below 2000% of their equity
capital. In the decades thereafter, banks devised risk measures and control
devices to ensure that they met these capital requirements. With the increased
risk created by the advent of derivative markets and floating exchange rates in
the early 1970s, capital requirements were refined and expanded in the SEC’s
Uniform Net Capital Rule (UNCR) that was promulgated in 1975, which
categorized the financial assets that banks held into twelve classes, based upon
risk, and required different capital requirements for each, ranging from 0% for
short term treasuries to 30% for equities. Banks were required to report on their
capital calculations in quarterly statements that were titled Financial and
Operating Combined Uniform Single (FOCUS) reports.
The first regulatory measures that evoke Value at Risk, though, were
initiated in 1980, when the SEC tied the capital requirements of financial
service firms to the losses that would be incurred, with 95% confidence over a
thirty-day interval, in different security classes; historical returns were used to
compute these potential losses. Although the measures were described as
haircuts and not as Value or Capital at Risk, it was clear the SEC was requiring
financial service firms to embark on the process of estimating one month 95%
VaRs and hold enough capital to cover the potential losses. At about the same
time, the trading portfolios of investment and commercial banks were becoming
larger and more volatile, creating a need for more sophisticated and timely risk
control measures. Ken Garbade at Banker’s Trust, in internal documents,
presented sophisticated measures of Value at Risk in 1986 for the firm’s fixed
income portfolios, based upon the covariance in yields on bonds of different
maturities. By the early 1990s,
many financial service firms had developed rudimentary measures of Value at
Risk, with wide variations on how it was measured. In the aftermath of
numerous disastrous losses associated with the use of derivatives and leverage
between 1993 and 1995, culminating with the failure of Barings, the British
investment bank, as a result of unauthorized trading in Nikkei futures and
options by Nick Leeson, a young trader in Singapore, firms were ready for more
comprehensive risk measures. In 1995, J.P.Morgan provided public access to
data on the variances of and co-variances across various security and asset
classes, that it had used internally for almost a decade to manage risk, and
allowed software makers to develop software to measure risk. It titled the
service “Risk Metrics” and used the term Value at Risk to describe the risk
measure that emerged from the data. The measure found a ready audience with
commercial and investment banks, and the regulatory authorities overseeing
them, who warmed to its intuitive appeal. In the last decade, VaR has become the established measure of risk exposure in financial service firms and has even begun to find acceptance in non-financial service firms.
1.2 VaR
VaR is generally considered a probability-based measure of loss potential. This definition is very general, however, and we need something more specific. More formally, VaR is the loss that would be exceeded with a given probability over a specified period of time. This definition has three important elements. First, we see that VaR is a loss that could be exceeded; hence, it is a measure of a minimum loss. Second, VaR is associated with a given probability: we would state that there is a certain percent chance that a particular loss would be exceeded. Finally, VaR is
defined for a specific period of time. Therefore, the loss that would be exceeded
with a given probability is a loss that would be expected to occur over a
specified time period. Consider the following example of VaR for an investment portfolio: The VaR for a portfolio is Rs. 15 million for
one day with a probability of 0.05. Consider what this statement says: There is a
5 percent chance that the portfolio will lose at least Rs. 15 million in a single
day. The emphasis here should be on the fact that the loss is a minimum, not a
maximum. Value at risk is a statistic that summarizes the exposures of
an asset or portfolio to market risks. VaR allows managers to quantify and
express risk. In other words, VaR is a measure of the maximum potential
change in the value of a portfolio of financial instruments with a given
probability over a pre-set horizon.
Thus, the value of VaR depends on: -
The Horizon over which the portfolio's change in value is measured.
The degree of confidence chosen for the measurement.
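The Rs. 15 million statement above can be illustrated empirically. A minimal sketch in Python, using simulated stand-in P&L numbers (not data from this study): the 5% one-day VaR is the loss exceeded on only about 5% of days.

```python
import random

random.seed(42)

# Hypothetical daily P&L outcomes (in Rs. million) for a portfolio;
# the numbers are simulated stand-ins, not data from this study.
pnl = [random.gauss(0, 10) for _ in range(10_000)]

# The 5% VaR is the loss that is exceeded on only 5% of days:
pnl_sorted = sorted(pnl)
var_5pct = -pnl_sorted[int(0.05 * len(pnl))]

# Check the interpretation: about 5% of outcomes lose more than the VaR.
exceedances = sum(1 for x in pnl if -x > var_5pct)
print(var_5pct, exceedances / len(pnl))
```

Note that the VaR figure is a minimum loss for the worst 5% of days: on those days the portfolio loses at least this amount, possibly much more.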
VaR is often considered a useful summary measure of market risk for
several reasons. One feature of VaR is its consistency as a measure of financial
risk. VaR facilitates direct comparison of risk across different portfolios and
distinct financial products. Also it allows the managers or investors to examine
potential losses over a particular time horizon with which they are concerned.
Another relative advantage of VaR is that it is largely tactically neutral. In other words,
VaR is calculated by examining the market risks of the individual instruments in
a portfolio, not using actual historical performance.
1.3 Mechanics of VaR Estimation
Establishing a VaR measure involves a number of decisions. Two
important ones are the choice of probability and the choice of the time period
over which the VaR will be measured. Once these parameters are chosen, one
can proceed to actually obtain the VaR estimate. The mechanics of VaR
estimation can be described as a 5-step process, which is explained with the
help of an example.
1.4 Steps in Constructing VaR
Assume, for instance, that we need to measure the VaR of a Rs. 100 cr equity portfolio over 10 days at the 99 percent confidence level. The following steps are required to compute VaR: -
Mark-to-market the current portfolio (e.g., Rs. 100 cr)
Measure the variability of the risk factor(s) (e.g., 15% per annum)
Set the time horizon, or the holding period (e.g., adjust to 10 business days)
Set the confidence level (e.g., 99 percent, which corresponds to 2.33 standard deviations assuming a normal distribution)
Report the worst loss by processing all the preceding information (e.g., a Rs. 7 cr VaR)
This is a very simple method of calculating VaR for a given portfolio, but in reality the calculation of VaR for general, parametric and other more complex distributions is more complicated, and different methods are used for calculating VaR; these are explained in detail in the subsequent part of the report.
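The five-step example above can be reproduced with a short back-of-the-envelope calculation. The 252 trading days per year and the one-tailed 99% multiplier of 2.33 are our assumptions, chosen to match the Rs. 7 cr figure in the example:

```python
import math

portfolio_value = 100.0   # Step 1: mark-to-market, Rs. 100 cr
annual_vol = 0.15         # Step 2: risk-factor variability, 15% per annum
horizon_days = 10         # Step 3: holding period of 10 business days
z_99 = 2.33               # Step 4: 99% one-tailed confidence, normal dist.

# Scale annual volatility to the 10-day horizon (assuming 252 trading days)
horizon_vol = annual_vol * math.sqrt(horizon_days / 252)

# Step 5: report the worst loss
var = portfolio_value * z_99 * horizon_vol
print(f"10-day 99% VaR: Rs. {var:.1f} cr")
```

The result is approximately Rs. 7 cr, matching the example.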
1.5 Value-at-Risk and Confidence levels
A more risk averse manager will want to determine VaR with greater
confidence -
Increasing the confidence level will increase VaR.
Decreasing the confidence level will decrease VaR.
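The effect can be seen directly from the one-tailed normal multipliers. A sketch with a hypothetical portfolio (the portfolio value and 2% daily volatility are invented for illustration):

```python
from statistics import NormalDist

value = 100.0    # hypothetical portfolio value
sigma = 0.02     # hypothetical daily return volatility

# The one-tailed normal multiplier grows with the confidence level,
# so the VaR estimate grows with it too.
var_by_conf = {c: value * sigma * NormalDist().inv_cdf(c)
               for c in (0.90, 0.95, 0.99)}
for c, v in var_by_conf.items():
    print(f"{c:.0%} confidence: one-day VaR = {v:.2f}")
```

Moving from 95% to 99% confidence raises the multiplier from about 1.64 to about 2.33 standard deviations, and VaR in the same proportion.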
1.6 Identifying the important market factors
In order to compute VaR (or any other quantitative measure of market
risk), we need to identify the basic market rates and prices that affect the value
of the portfolio. These basic market rates and prices are the “market factors”. It
is necessary to identify a limited number of basic market factors simply because
otherwise the complexity of trying to come up with a portfolio level quantitative
measure of market risk explodes. Even if we restrict our attention to simple
instruments such as forward contracts, an almost countless number of different
contracts can exist, because virtually any forward price and delivery date are
possible. The same basic market factors are inherent in most other instruments, and instruments such as swaps, loans, options and exotic options are of course ever more complicated. Thus, expressing the instruments’ values in terms of a limited number of basic market factors is an essential first step in making the problem manageable. Typically, market factors are identified by decomposing the instruments in the portfolio into simpler instruments more directly related to basic market risk factors, and then interpreting the actual instruments as portfolios of the simpler instruments.
1.7 VaR Methods
There are three different methods for calculation of VaR namely: -
i. Analytic Method
ii. Historical Simulation Method
iii. Monte Carlo Simulation Method
i. Analytic Method
The analytic method follows the variance/covariance approach, which uses historical volatility and correlation data to predict the way markets are likely to move in future. By assuming that the underlying market factors follow a normal distribution, the VaR estimate can be calculated analytically for any confidence interval.
There are essentially two types of analytic method: -
Delta-Normal Method : This method involves linear approximation of
the price changes. It is mainly suitable when the portfolio does not
contain non-linear products and when the movements in the risk factors
are small. This method can accommodate a large number of assets and is
simple to implement.
Delta-Gamma Method : This method improves upon the linear
approximation in the Delta-Normal Method by taking into account the
second order term also. However, inclusion of this term skews the
distribution of changes in portfolio values. Hence the simplicity of the
Delta-Normal approach is lost.
The RiskMetrics methodology is based on the analytic method. The main advantage of this method is its simplicity and ease of implementation. This method is easy to communicate because of its standardization. The Delta-Gamma method performs well provided the Greeks are stable. Thus, it is not a good measure of risk for at-the-money options, near-maturity options, barrier options where the price is close to the barrier, etc.
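As a sketch of the delta-normal calculation for a hypothetical two-asset portfolio (the positions, volatilities and correlation below are all invented for illustration), the portfolio's P&L volatility comes from the variance/covariance formula and is then scaled by the normal multiplier:

```python
import math

v1, v2 = 60.0, 40.0    # rupee positions in assets 1 and 2 (hypothetical)
s1, s2 = 0.02, 0.03    # daily return volatilities (hypothetical)
rho = 0.5              # return correlation (hypothetical)
z_95 = 1.645           # 95% one-tailed normal multiplier

# Portfolio P&L standard deviation: sqrt(w' * Sigma * w), written out
# explicitly for the two-asset case.
port_sigma = math.sqrt((v1 * s1) ** 2 + (v2 * s2) ** 2
                       + 2 * rho * (v1 * s1) * (v2 * s2))
var_95 = z_95 * port_sigma
print(f"one-day 95% delta-normal VaR: {var_95:.2f}")
```

Because the correlation is below 1, this portfolio VaR is smaller than the sum of the two stand-alone VaRs; this diversification effect is precisely what the covariance matrix captures.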
Assessment
The strength of the Variance-Covariance approach is that the Value at
Risk is simple to compute, once you have made an assumption about the
distribution of returns and inputted the means, variances and co-variances of
returns. In the estimation process, though, lie the three key weaknesses of the
approach:
Wrong distributional assumption - If conditional returns are not
normally distributed, the computed VaR will understate the true VaR. In
other words, if there are far more outliers in the actual return distribution
than would be expected given the normality assumption, the actual Value
at Risk will be much higher than the computed Value at Risk.
Input error - Even if the standardized return distribution assumption
holds up, the VaR can still be wrong if the variances and co-variances
that are used to estimate it are incorrect. To the extent that these numbers
are estimated using historical data, there is a standard error associated
with each of the estimates. In other words, the variance - covariance
matrix that is input to the VaR measure is a collection of estimates, some
of which have very large error terms.
Non-stationary variables - A related problem occurs when the variances
and co-variances across assets change over time. This non-stationarity in
values is not uncommon because the fundamentals driving these numbers
do change over time.
ii. Historical Simulation Method
The historical method identifies a portfolio's exposure to specific market
factors and calculates (say daily) observed changes in these market factors over
the time horizon (say 100 days) to be used in the VaR calculation. The portfolio
is then revalued as if each change occurred from today's levels, thus creating
100 possible changes to the portfolio's value. From these figures, a VaR number
corresponding to a given confidence level is determined. The method is
relatively simple to implement if historical data is easily available. By relying
on actual prices, the method allows non-linearity and non-normal distributions.
It does not rely on specific assumption about valuation models or the underlying
stochastic structure of the market. It accounts for "fat tails" and since it does not
rely on valuation models, it is not prone to model risk. However, the historical
simulation method uses only one path (i.e. the actual past). It also assumes that
the past represents the immediate future fairly. This method may miss situations
with temporarily elevated volatility. Further, the method puts the same weight on all observations in the window, including old data points. Thus, the measure of risk can change significantly when an old observation is dropped from the window. Historical simulation also becomes very cumbersome for large portfolios with complicated structures.
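A minimal sketch of the method for a hypothetical single-asset portfolio (the 100 "historical" returns here are simulated stand-ins, not market data):

```python
import random

random.seed(7)

position = 100.0   # current mark-to-market value (hypothetical)

# Stand-in for 100 observed daily returns of the market factor.
hist_returns = [random.gauss(0.0005, 0.015) for _ in range(100)]

# Revalue today's position under each historical change, giving
# 100 hypothetical P&L outcomes, then rank them.
pnl = sorted(position * r for r in hist_returns)

# With 100 scenarios, the 95% VaR is read off the 5th worst outcome.
var_95 = -pnl[4]
print(f"historical-simulation 95% VaR: {var_95:.2f}")
```

No distributional assumption enters: the VaR is read directly off the empirical distribution, which is why the method accommodates fat tails and non-linearity.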
Assessment
While historical simulations are popular and relatively easy to run, they do come with baggage. In particular, the underlying assumptions of the model give rise to its weaknesses:
(a) Past is not prologue – While all three approaches to estimating VaR
use historical data, historical simulations are much more reliant on them than
the other two approaches for the simple reason that the Value at Risk is
computed entirely from historical price changes. There is little room to overlay
distributional assumptions (as we do with the Variance-covariance approach) or
to bring in subjective information (as we can with Monte Carlo simulations).
(b) Trends in the data - A related argument can be made about the way
in which we compute Value at Risk, using historical data, where all data points
are weighted equally. In other words, the price changes from trading days in
1992 affect the VaR in exactly the same proportion as price changes from
trading days in 1998. To the extent that there is a trend of increasing volatility
even within the historical time period, we will understate the Value at Risk.
(c) New assets or market risks - While this could be a critique of any of
the three approaches for estimating VaR, the historical simulation approach has
the most difficulty dealing with new risks and assets for an obvious reason:
there is no historic data available to compute the Value at Risk. Assessing the
Value at Risk to a firm from developments in online commerce in the late 1990s
would have been difficult to do, since the online business was in its nascent
stage.
Modifications
As with the other approaches to computing VaR, there have been
modifications suggested to the approach, largely directed at taking into account
some of the criticisms mentioned in the last section.
(a) Weighting the recent past more - A reasonable argument can be
made that returns in the recent past are better predictors of the immediate future
than are returns from the distant past. Boudoukh, Richardson and Whitelaw
present a variant on historical simulations, where recent data is weighted more,
using a decay factor as their time weighting mechanism. In simple terms, each
return, rather than being weighted equally, is assigned a probability weight
based on its recency. In other words, if the decay factor is 0.90, the most recent
observation has the probability weight p, the observation prior to it will be
weighted 0.9p, the one before that is weighted 0.81p and so on. In fact, the
conventional historical simulation approach is a special case of this approach,
where the decay factor is set to 1.
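The decay weighting can be sketched as follows (the 0.90 decay factor and the five-observation window are illustrative only):

```python
decay = 0.90
n = 5   # small window for illustration

# The i-th most recent observation gets weight proportional to decay**i:
# p, 0.9p, 0.81p, ...; normalising makes the weights sum to 1.
raw = [decay ** i for i in range(n)]
weights = [w / sum(raw) for w in raw]
print([round(w, 3) for w in weights])

# To read VaR off these weights, sort the returns from worst to best and
# accumulate the weights until the chosen tail probability is reached.
```

Setting decay = 1 gives equal weights, recovering the conventional historical simulation as a special case, exactly as noted above.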
(b) Combining historical simulation with time series models - Cabedo and Moya suggested that better estimates of VaR could be obtained by fitting a time series model to the historical data and using the parameters of that model to forecast the Value at Risk.
(c) Volatility Updating - Hull and White suggest a different way of
updating historical data for shifts in volatility. For assets where the recent
volatility is higher than historical volatility, they recommend that the historical
data be adjusted to reflect the change.
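In outline, and with invented numbers, the adjustment rescales each historical return by the ratio of today's volatility estimate to the estimate prevailing when that return was observed:

```python
# Daily returns and the volatility estimated for each day (hypothetical).
hist_returns = [0.010, -0.020, 0.005, -0.015, 0.030]
hist_vols    = [0.010,  0.012, 0.011,  0.015, 0.020]
current_vol  = 0.025   # today's volatility estimate (hypothetical)

# Rescale: returns from calm periods are inflated when current
# volatility is higher, and deflated in the opposite case.
adjusted = [r * current_vol / v for r, v in zip(hist_returns, hist_vols)]
print([round(r, 4) for r in adjusted])
```

The historical simulation then proceeds as before, but on the adjusted returns, so the VaR reflects today's volatility regime rather than the average of the window.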
iii. Monte Carlo Simulation Method
The Monte-Carlo simulation methodology has a number of similarities to
historical simulation. The main difference is that rather than carrying out the
simulation using the observed changes in the market factors over the last N
periods to generate N hypothetical portfolio profits and losses, one chooses a
statistical distribution that is believed to adequately capture or approximate the
possible changes in the market factors. Then, a pseudo-random number generator is used to generate thousands or perhaps tens of thousands of hypothetical changes in the market factors. These are then used to construct thousands of hypothetical profits and losses on the current portfolio, and thereby the distribution of those profits and losses. Finally, the Value at Risk is determined
from this distribution.
General Description
The first two steps in a Monte Carlo simulation mirror the first two steps
in the Variance-covariance method, where we identify the market risks that
affect the asset or assets in a portfolio and convert individual assets into
positions in standardized instruments. It is in the third step that the differences
emerge. Rather than compute the variances and co-variances across the market
risk factors, we take the simulation route, where we specify probability
distributions for each of the market risk factors and specify how these market
risk factors move together. Thus, in the example of the six-month Dollar/Euro
forward contract that we used earlier, the probability distributions for the 6-
month zero coupon $ bond, the 6-month zero coupon euro bond and the
dollar/euro spot rate will have to be specified, as will the correlation across
these instruments. While the estimation of parameters is easier if you assume
normal distributions for all variables, the power of Monte-Carlo simulations
comes from the freedom you have to pick alternate distributions for the
variables. In addition, you can bring in subjective judgments to modify these
distributions. Once the distributions are specified, the simulation process starts.
In each run, the market risk variables take on different outcomes and the value
of the portfolio reflects the outcomes. After a repeated series of runs, numbering
usually in the thousands, you will have a distribution of portfolio values that can
be used to assess Value at Risk. For instance, assume that you run a series of
10,000 simulations and derive corresponding values for the portfolio. These
values can be ranked from highest to lowest, and the 95% Value at Risk will correspond to the 500th lowest value and the 99% Value at Risk to the 100th lowest value.
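The ranking step above can be sketched for a single hypothetical asset; the normal distribution and its parameters here are assumptions chosen purely for illustration:

```python
import random

random.seed(1)

value = 100.0          # hypothetical portfolio value
mu, sigma = 0.0, 0.02  # assumed daily return distribution

# 10,000 simulated one-day P&L outcomes, ranked from lowest to highest.
pnl = sorted(value * random.gauss(mu, sigma) for _ in range(10_000))

var_95 = -pnl[499]     # 500th lowest value -> 95% VaR
var_99 = -pnl[99]      # 100th lowest value -> 99% VaR
print(f"95% VaR: {var_95:.2f}, 99% VaR: {var_99:.2f}")
```

In practice the draws would come from whatever joint distribution is specified for the market factors; the power of the method is that any distribution, not just the normal, can be plugged into this same ranking step.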
Assessment
Much of what was said about the strengths and weaknesses of the simulation approach in the last chapter applies to its use in computing Value at Risk. Quickly reviewing the criticism: a simulation is only as good as the
probability distribution for the inputs that are fed into it. While Monte Carlo
simulations are often touted as more sophisticated than historical simulations,
many users directly draw on historical data to make their distributional
assumptions. In addition, as the number of market risk factors increases and
their co-movements become more complex, Monte Carlo simulations become
more difficult to run for two reasons. First, you now have to estimate the
probability distributions for hundreds of market risk variables rather than just
the handful that we talked about in the context of analyzing a single project or
asset. Second, the number of simulations that you need to run to obtain a reasonable estimate of Value at Risk will have to increase substantially (to the
tens of thousands from the thousands). The strengths of Monte Carlo
simulations can be seen when compared to the other two approaches for
computing Value at Risk. Unlike the variance-covariance approach, we do not
have to make unrealistic assumptions about normality in returns. In contrast to
the historical simulation approach, we begin with historical data but are free to
bring in both subjective judgments and other information to improve forecasted
probability distributions. Finally, Monte Carlo simulations can be used to assess
the Value at Risk for any type of portfolio and are flexible enough to cover
options and option-like securities.
Modifications
As with the other approaches, the modifications to the Monte Carlo
simulation are directed at its biggest weakness, which is its computational
bulk. To provide a simple illustration, a yield curve model with 15 key rates and four possible values for each will require 1,073,741,824 (4^15) simulations to be complete. The modified versions narrow the focus, using different techniques,
and reduce the required number of simulations:-
(a) Scenario Simulation - One way to reduce the computation burden of
running Monte-Carlo simulations is to do the analysis over a number of discrete
scenarios. Frye suggests an approach that can be used to develop these scenarios
by applying a small set of pre-specified shocks to the system. Jamshidian and
Zhu (1997) suggest what they called scenario simulations where they use
principal component analysis as a first step to narrow the number of factors.
Rather than allow each risk variable to take on all of the potential values, they
look at likely combinations of these variables to arrive at scenarios. The values
are computed across these scenarios to arrive at the simulation results.
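The principal component step that scenario simulation relies on can be sketched as follows. The 15-rate yield-curve history and its two-factor structure are invented purely for illustration:

```python
import numpy as np

def principal_factors(returns, var_explained=0.95):
    """Reduce many correlated risk factors to a few principal components,
    in the spirit of Jamshidian and Zhu's scenario simulation.
    Returns the leading eigenvectors and their variances.
    """
    centered = returns - returns.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]              # largest variance first
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    cum = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(cum, var_explained)) + 1
    return eigvecs[:, :k], eigvals[:k]

# Hypothetical history: 500 days x 15 key rates driven by two common
# factors plus small noise, so a handful of components should suffice.
rng = np.random.default_rng(0)
factors = rng.standard_normal((500, 2))
loadings = rng.standard_normal((2, 15))
rates = factors @ loadings + 0.05 * rng.standard_normal((500, 15))
components, variances = principal_factors(rates)
```

If only two or three components survive and each is given, say, four discrete values, the scenario count falls from 4^15 to at most 4^3 = 64, which is the whole point of the modification.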
(b) Monte Carlo Simulations with Variance-Covariance method
modification – The strength of the Variance-covariance method is its speed. If
you are willing to make the required distributional assumption about normality
in returns and have the variance-covariance matrix in hand, you can compute
the Value at Risk for any portfolio in minutes. The strength of the Monte Carlo
simulation approach is the flexibility it offers users to make different
distributional assumptions and deal with various types of risk, but it can be
painfully slow to run. Glasserman, Heidelberger and Shahabuddin use
approximations from the variance covariance approach to guide the sampling
process in Monte Carlo simulations and report a substantial savings in time and
resources, without any appreciable loss of precision. The trade-off in each of
these modifications is simple. You give up some of the power and precision of the
Monte Carlo approach but gain in terms of estimation requirements and
computational time.
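The speed the variance-covariance method trades on comes from its closed form: given the covariance matrix, portfolio VaR is a single expression. A minimal sketch, assuming normal returns and using the same hypothetical portfolio figures as any textbook example:

```python
import numpy as np
from math import sqrt

def parametric_var(weights, cov, z=2.326, horizon_value=1.0):
    """Variance-covariance (delta-normal) VaR:
        VaR = z * sqrt(w' Sigma w) * portfolio value,
    where z = 2.326 is the one-tailed 99% normal quantile.
    """
    sigma_p = sqrt(weights @ cov @ weights)    # portfolio volatility
    return z * sigma_p * horizon_value

# Hypothetical two-asset portfolio of value 1,000,000
w = np.array([0.6, 0.4])
cov = np.array([[0.0004, 0.0001],
                [0.0001, 0.0002]])
var_vc = parametric_var(w, cov, horizon_value=1_000_000)
```

One matrix product replaces tens of thousands of simulated paths, which is why Glasserman, Heidelberger and Shahabuddin use this approximation to guide the Monte Carlo sampling rather than discard it.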
1.8 REVIEW OF LITERATURE
Literature was reviewed with the aim of gaining insight into two major
facets of our problem.
a) VaR Calculation
b) Selection of the optimal VaR calculation method
VaR is straightforward to estimate and interpret as a measure of risk
exposure, and these advantages often appeal to asset managers (Culp, Mensink,
and Neves 1998). However, most of the current research on Value-at-Risk
(VaR) estimation focuses on the one-dimensional (univariate) case. One of the
first attempts to compute PVaR from a model of the joint returns distribution
was reported by Frauendorfer, Moix, and Schmid (1995), but applications of
this method are limited because the PVaR model cannot be stated in closed
form and can only be approximated with complex computational algorithms.
Alternatively, Wang and Wu (2001) use linear combinations of returns
models based on extreme value theory to approximate the tail areas of heavy-
tailed distributions, but this approach may be undesirable because it only
focuses on the lower tail. The alternative approaches that are currently popular
include the variance-covariance (VC) method, Monte Carlo simulation, delta-
Normal simulation, and historical simulation (HS) (Dowd 1998).
In one of the studies of VaR on the Indian stock market, Varma (1999)
assumes a Generalized Error Distribution (GED) and uses GARCH(GED),
EWMA(GED) and EWMA(RM) models to estimate VaR. The study computes
the nominal coverage, i.e., the ratio of number of exceedences to the total
number of observations, and compares it with the true coverage. The study
preferred the use of GARCH(GED) model over the other two models on the
basis of the results.
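The EWMA(RM) volatility recursion underlying two of the models Varma compares can be sketched as follows. The return series, the mid-sample volatility spike and the seed window are simulated purely for illustration; only the recursion itself and the 0.94 decay factor follow the RiskMetrics convention:

```python
import numpy as np

def ewma_volatility(returns, lam=0.94):
    """RiskMetrics-style EWMA variance recursion:
        sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2
    lam = 0.94 is the standard RiskMetrics daily decay factor.
    """
    sigma2 = np.empty_like(returns, dtype=float)
    sigma2[0] = returns[:20].var()             # seed with sample variance
    for t in range(1, len(returns)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * returns[t - 1] ** 2
    return np.sqrt(sigma2)

# Hypothetical daily returns with a turbulent episode mid-sample
rng = np.random.default_rng(1)
r = 0.01 * rng.standard_normal(500)
r[250:300] *= 3                                # volatility spike
sigma = ewma_volatility(r)
var_95 = 1.645 * sigma[-1]                     # next-day 95% VaR multiple
```

The exponential weights let the estimate climb quickly during the spike and decay back afterwards, which is exactly the conditional behaviour the coverage tests in these studies assess.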
In another study of Nifty and S&P 500, Sarma et al. (2003) used four
models (GARCH, EWMA, Risk Metrics-RM, and Historical Simulation-HS)
and their different variations, on the basis of the number of data points used,
under the assumption of normally distributed errors, finding that GARCH
and RM fare well, with the latter having a slight edge. They used backtesting
methods for performance assessment of various models by testing for
conditional and unconditional coverage and independence that were perfected
by Christoffersen (1998) as well as loss functions developed by Lopez (1998).
In a study of the indices of five developed countries, Angelidis
et al. (2004) used three variations of the GARCH model (naïve, EGARCH,
TGARCH) and various orders of AR processes on normal, GED and t-
distributions. They also tested for conditional and unconditional coverage using
Christoffersen's method, but could not point out any model as the 'best' model.
They found the Student's t distribution to capture risk better than the other
distributions.
Bao et al. (2004) checked the performance of VaR models in terms of
empirical coverage, taking both parametric and nonparametric models. They used
the normal distribution, historical simulation, Monte Carlo simulation, a non-
parametrically estimated distribution and the extreme value distribution, along
with RM as benchmark. They analyzed the model performance before, during, and
after the Asian financial crisis. In the pre-crisis period, RM was found to be quite
a good model, with normal not being far behind. Historical Simulation (HS), Non-
Parametric (NP) methods and Monte Carlo (MC) simulation were also seen to
be satisfactory. During the Asian financial crisis, all the models understated the
VaR numbers; however, the EVT-based one did the best job. The post-crisis
period results were found to be similar to the pre-crisis period results. From the
study, it is clear that the conventional models do a good job during the normal
periods.
Nath and Samantha (2003) studied VaR for the Indian banking
system. They used one-day returns on Government of India securities as the
variable. The models used were normal, historical simulation, RiskMetrics and
Hill's estimator. They found that VaR models under the variance-covariance/normal
approach, particularly RiskMetrics, underestimated the VaR numbers. The GARCH
(normal) model performed slightly better than the RiskMetrics model. While
HS provided quite reasonable estimates, Hill's estimator overestimated the VaR
numbers, as the number of failures was less than the theoretical expectation.
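Hill's estimator, mentioned above, fits a Pareto-type tail to the largest losses and extrapolates the quantile. The sketch below is illustrative only: the heavy-tailed loss sample is simulated, and the choice of k = 100 tail observations is an assumption, not a value from the study:

```python
import numpy as np

def hill_var(losses, k=50, p=0.01):
    """EVT tail quantile via the Hill estimator.

    xi is the average log-spacing of the k largest losses over the
    (k+1)-th; the p-tail quantile is extrapolated as
        VaR_p = X_(k+1) * (k / (n*p))**xi.
    """
    x = np.sort(losses)[::-1]                    # descending order stats
    xi = np.mean(np.log(x[:k]) - np.log(x[k]))   # Hill tail-index estimate
    n = len(losses)
    return x[k] * (k / (n * p)) ** xi

# Hypothetical heavy-tailed loss sample (absolute Student-t, 3 d.o.f.)
rng = np.random.default_rng(2)
losses = np.abs(rng.standard_t(df=3, size=5000))
var_99 = hill_var(losses, k=100, p=0.01)
```

Because only the tail observations enter the estimate, the method makes no assumption about the centre of the return distribution, which is its appeal and, as the overestimation found above suggests, also its sensitivity to the choice of k.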
Value at Risk Models in the Indian Stock Market
by
Jayanth R. Varma
IIM A
This paper provides empirical tests of different risk management models
in the Value at Risk (VaR) framework in the Indian stock market. It is found
that the GARCH-GED (Generalised Auto-Regressive Conditional
Heteroscedasticity with Generalised Error Distribution residuals) performs
exceedingly well at all common risk levels (ranging from 0.25% to 10%). The
EWMA (Exponentially Weighted Moving Average) model used in J. P.
Morgan’s RiskMetrics® methodology does well at the 10% and 5% risk levels
but breaks down at the 1% and lower risk levels. The paper then suggests a way
of salvaging the EWMA model by using a larger number of standard deviations
to set the VaR limit. For example, the paper suggests using 3 standard
deviations for a 1% VaR while the normal distribution indicates 2.58 standard
deviations and the GED indicates 2.85 standard deviations. With this
modification the EWMA model is shown to work quite well. Given its greater
simplicity and ease of interpretation, it may be more convenient in practice to
use this model than the more accurate GARCH-GED specification. The paper
also provides evidence suggesting that it may be possible to improve the
performance of the VaR models by taking into account the price movements in
foreign stock markets.
Decomposing Portfolio Value-at-Risk : A General Analysis
Winfried G. Hallerbach
Erasmus University Rotterdam and Tinbergen Institute Graduate School
of Economics
An intensive and still growing body of research focuses on estimating a
portfolio’s Value at Risk. Aside from the total portfolio’s VaR, there is a
growing need for information about (i) the marginal contribution of the
individual portfolio components to the diversified portfolio VaR, (ii) the
proportion of the diversified portfolio VaR that can be attributed to each of the
individual components constituting the portfolio, and (iii) the incremental effect
on VaR of adding a new instrument to the existing portfolio. For many
portfolios, however, the assumption of normally distributed returns is too
stringent. There exist to the best of our knowledge no procedures for estimating
marginal VaR, component VaR and incremental VaR in either a non-normal
analytical setting or a Monte Carlo / historical simulation context. This paper
tries to fill this gap by investigating these VaR concepts in a general
distribution-free setting. We derive a general expression for the marginal
contribution of an instrument to the diversified portfolio VaR whether this
instrument is already included in the portfolio or not. We show how, in a most
general way, the total portfolio VaR can be decomposed into partial VaRs that can
be attributed to the individual instruments comprised in the portfolio. These
component VaRs have the appealing property that they aggregate linearly into
the diversified portfolio VaR. We not only show how the standard results under
normality can be generalized to non-normal analytical VaR approaches but also
present an explicit procedure for estimating marginal VaRs in a simulation
framework. Given the marginal VaR estimate, component VaR and incremental
VaR readily follow. The proposed estimation approach pairs intuitive appeal
with computational efficiency. We evaluate various alternative estimation
methods in an application example and conclude that the proposed approach
displays an astounding accuracy and a promising outperformance.
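The standard result under normality that Hallerbach generalizes can be sketched as follows. The three-asset weights and covariance matrix are hypothetical; the point is the Euler allocation, under which component VaRs aggregate linearly into the total:

```python
import numpy as np

def component_var(weights, cov, z=2.326):
    """Euler decomposition of normal portfolio VaR.

    marginal_i = dVaR/dw_i = z * (Sigma w)_i / sqrt(w' Sigma w)
    component_i = w_i * marginal_i, and the components sum exactly
    to the diversified portfolio VaR.
    """
    sigma_p = np.sqrt(weights @ cov @ weights)
    marginal = z * (cov @ weights) / sigma_p     # marginal VaR per asset
    component = weights * marginal               # Euler allocation
    total = z * sigma_p
    return total, marginal, component

# Hypothetical three-asset portfolio
w = np.array([0.5, 0.3, 0.2])
cov = np.array([[0.0004, 0.0001, 0.0],
                [0.0001, 0.0002, 0.00005],
                [0.0,     0.00005, 0.0001]])
total, marginal, comp = component_var(w, cov)
# comp.sum() equals total: the linear-aggregation property in the abstract
```

Given the marginal VaR, incremental VaR for a small position change follows by a first-order approximation, which is the relation the paper's simulation estimator exploits.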
Value-at-Risk for Fixed Income portfolios : A comparison of alternative
models
Gangadhar Darbha
National Stock Exchange, Mumbai, India
December 2001
The paper presents a case for a new method for computing the VaR for a
set of fixed income securities based on extreme value theory that models the tail
probabilities directly without making any assumption about the distribution of
entire return process. It compares the estimates of VaR for a portfolio of fixed
income securities across three methods: the Variance-Covariance method, the
Historical Simulation method and the Extreme Value method, and finds that the
extreme value method provides the most accurate VaR estimator in terms of
correct failure ratio and the size of VaR.
Portfolio Value at Risk Based On Independent Components Analysis
Ying Chen, Wolfgang Härdle and Vladimir Spokoiny
Risk management technology applied to high dimensional portfolios
needs simple and fast methods for calculation of Value-at-Risk (VaR). The
multivariate normal framework provides a simple off-the-shelf methodology but
lacks the heavy tailed distributional properties that are observed in data. A
principle component based method (tied closely to the elliptical structure of the
distribution) is therefore expected to be unsatisfactory. Here we propose and
analyze a technology that is based on Independent Component Analysis (ICA).
We study the proposed ICVaR methodology in an extensive simulation study
and apply it to a high dimensional portfolio situation. Our analysis yields very
accurate VaRs.
Evaluating Portfolio Value-at-Risk using Semi-Parametric GARCH
Models
Jeroen V.K. Rombouts and Marno Verbeek
December 2004
In this paper we examine the usefulness of multivariate semi-parametric
GARCH models for portfolio selection under a Value-at-Risk (VaR) constraint.
First, we specify and estimate several alternative multivariate GARCH models
for daily returns on the S&P 500 and Nasdaq indexes. Examining the within-
sample VaRs of a set of given portfolios shows that the semi-parametric model
performs uniformly well, while parametric models in several cases have
unacceptable failure rates. Interestingly, distributional assumptions appear to
have a much larger impact on the performance of the VaR estimates than the
particular parametric specification chosen for the GARCH equations. Finally,
we examine the economic value of the multivariate GARCH models by
determining optimal portfolios based on maximizing expected returns subject to
a VaR constraint, over a period of 500 consecutive days. Again, the superiority
and robustness of the semi-parametric model is confirmed.
Value-at-Risk Based Portfolio Optimization
Working Paper by
Amy v. Puelz
Edwin L. Cox School of Business, Southern Methodist University
The Value at Risk (VaR) metric, a widely reported and accepted measure
of financial risk across industry segments and market participants, is discrete by
nature measuring the probability of worst case portfolio performance. In this
paper I present four model frameworks that apply VaR to ex ante portfolio
decisions. The mean-variance model, Young's (1998) minimax model and Hiller
and Eckstein's (1993) stochastic programming model are extended to
incorporate VaR. A fourth, new model implements stochastic
programming with a return aggregation technique. Performance tests are
conducted on the four models using empirical and simulated data. The new
model most closely matches the discrete nature of VaR, exhibiting statistically
superior performance across the series of tests. Robustness tests of the four
model forms provide support for the argument that VaR-based investment
strategies lead to higher-risk decisions than those where the severity of worst
case performance is also considered.
CHAPTER 2
RESEARCH DESIGN
2.1 Statement of Problem
In volatile financial markets, both market participants and market
regulators need models for measuring, managing and containing risks. Market
participants need risk management models to manage the risks involved in their
open positions. Market regulators on the other hand must ensure the financial
integrity of the stock exchanges and the clearing houses by appropriate
margining and risk containment systems.
Inaccurate VaR or PVaR estimates may lead to a redundant amount of
risk capital being maintained, which will reduce capital management efficiency,
or to insufficient capital, which will increase financial risk. Therefore an attempt
is made in this study to find out the right tool to measure Portfolio Value at Risk
(PVaR).
2.2 Research Objectives
In this paper, my attempt is to propose an integrated method to compute
PVaR, and also,
to use VaR as a measure of the risk of portfolios
to know the application of the various Value at Risk (VaR) models
convenient for measuring the risk of portfolios
to compare the results of the various models
to give a conclusion regarding the best method to be adopted to determine
Value at Risk
2.3 Hypothesis
In order to test the stationarity of the data collected, the Augmented Dickey-Fuller
Unit Root Test has been incorporated, where the hypotheses are
H0 : Null Hypothesis — the series has a unit root, i.e. the data is non-stationary, and
H1 : Alternate Hypothesis — the series has no unit root, i.e. the data is stationary.
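The decision rule of the unit root test can be illustrated with a simplified, unaugmented Dickey-Fuller regression; the study itself used the Augmented version, and the simulated price series below is hypothetical:

```python
import numpy as np

def dickey_fuller_t(series):
    """t-statistic on rho in:  diff(y)_t = alpha + rho * y_{t-1} + e_t.

    A t-statistic below the relevant critical value (about -2.86 at the
    5% level for the intercept case) rejects H0, so the series is treated
    as stationary. This is the unaugmented Dickey-Fuller regression, a
    simplified stand-in for the ADF test used in the study.
    """
    y = np.asarray(series, dtype=float)
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])   # intercept + lag
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)                # residual variance
    se_rho = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return beta[1] / se_rho

rng = np.random.default_rng(3)
# Random walk in logs: levels have a unit root, returns do not
prices = 100 * np.exp(np.cumsum(0.01 * rng.standard_normal(500)))
t_level = dickey_fuller_t(np.log(prices))          # not significantly negative
t_diff = dickey_fuller_t(np.diff(np.log(prices)))  # strongly negative
```

The differenced (log-return) series produces a far more negative statistic than the levels, which mirrors why the study works with log returns rather than raw prices.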
2.4 Scope of the Study
The scope of this study is
To create awareness of Value at Risk (VaR) as a measure of the risk of
portfolios
To give simple information to lay investors, to guide them in the selection
of portfolios for their investments
2.5 Research Methodology
Actual collection of data : The data is a portfolio consisting of 10 assets. Four
Equities :
Reliance Industries Limited, DLF Limited, Bharti Airtel and Infosys
Technologies Limited.
Three AAA Rated Bonds :
Power Finance Corporation Limited, Indian Railway Finance Corporation and
Housing Development Finance Corporation Limited.
Three Currencies :
Dollar, Euro and Pound.
Data Source : For equities, the closing price is taken for the period July 2007
to March 2009; for bonds, the weighted average price is taken for the period
December 2008 to March 2009, both from nse-india.com; and for currencies,
the exchange rate is taken from oanda.com for the period July 2007 to March
2009.
Data analysis : The data generated were subjected to rigorous statistical
treatment and inferences were drawn accordingly. Appropriate statistical
tools were applied, with Excel sheets and appropriate graphs used as
instruments for preparing the study.
2.6 Limitations
1. Time and resources were the major constraints in conducting the
research.
2. The bond instruments have been chosen on a random basis, as very few
AAA-rated bonds are actively traded. Also, the ratings have changed over
the years.
3. The period of study is restricted to only one year.
2.7 Chapter Scheme
1. Introduction
1.1 Background
1.2 Value at Risk
1.3 Mechanics of VaR estimation
1.4 Steps in Constructing VaR
1.5 VaR and Confidence Levels
1.6 Identifying the important Market Factors
1.7 VaR Methods
1.8 Review of Literature
2. Research Design
2.1 Statement of Problem
2.2 Research Objectives
2.3 Hypothesis
2.4 Scope of the Study
2.5 Research Methodology
2.6 Limitations
2.7 Chapter Scheme
3. Industry Profile
3.1 Equity
3.2 Bonds
3.3 Currency
4. Data Analysis and Interpretation
4.1 ADF Tests for Equities, Bonds and Currencies
4.2 Historical Simulation
4.3 Analytic Approach
4.4 Single Asset Case
4.5 Two Asset Case
5. Findings, Conclusion and Suggestions
5.1 Comparing Approaches
5.2 Conclusion
5.3 Recommendations
Bibliography
Annexure
CHAPTER 3
INDUSTRY PROFILE
This report was not done in any particular industry; the research is about the
risk involved in a portfolio. Here, I would like to comment on the various assets
which I have included in the portfolio, and briefly describe the reason for
taking these assets in my portfolio. My portfolio consists of :
four equities,
three AAA rated bonds, and
three currencies.
3.1 EQUITIES
The four equities are Reliance Industries Ltd., DLF Limited (Real
Estate), Bharti Airtel (Telecom) and Infosys Technologies Ltd. (IT Sector).
Reliance Industries Ltd. (Refineries)
The Reliance Group, founded by Dhirubhai H. Ambani (1932-2002), is
India's largest private sector enterprise, with businesses in the energy and
materials value chain. The Group's annual revenues are in excess of US$ 34 billion.
The flagship company, Reliance Industries Limited, is a Fortune Global 500
company and is the largest private sector company in India. Reliance enjoys
global leadership in its businesses, being the largest polyester yarn and fiber
producer in the world and among the top five to ten producers in the world in
major petrochemical products.
Reliance Industries (RIL) is the country's largest private sector company.
The company has a 26% share of the total refining capacity in India and along
with its subsidiary, IPCL, controls over 70% of the country's domestic polymer
capacity. RIL is also a major player in polyester fiber and yarn, with a
combined capacity of 2 million tonnes. The company has also ventured into the
upstream sector, whereby it has participating interests in existing oil and gas
fields. RIL has a large exploration acreage with 34 domestic exploration blocks
in addition to one exploration block each in Yemen and Oman. RIL also has
exploration and production rights to 5 coal bed methane (CBM) blocks. The
company also has a presence in the downstream segment and has commissioned
1,218 outlets out of a permitted 5,849 outlets (FY06).
DLF Limited (Construction)
The DLF Group was founded in 1946. The Group developed some of the first
residential colonies in Delhi, such as Krishna Nagar in East Delhi, which was
completed in 1949.
DLF is India's largest real estate company in terms of revenues, earnings,
market capitalization and developable area. In line with its current expansion
plans, DLF has over 751 million sq. ft. of development across its businesses,
including developed, on-going and planned projects. This land bank is spread
over 32 cities, mostly in metros and key urban areas across India. Already a
major player in locations across the country, DLF, with over six decades of
experience, is capitalizing on emerging market opportunities to deliver high-end
facilities and projects to its wide base of customers by constantly upgrading its
internal skills and resource capabilities.
All the intensified growth underlines DLF's commitment to quality, trust
and customer sensitivity and, delivering on its promise with agility and financial
prudence. This, in turn, has earned DLF the coveted 'Superbrand' ranking for
three years consecutively, including the current year.
Bharti Airtel (Telecom)
Telecom giant Bharti Airtel is the flagship company of Bharti
Enterprises. The Bharti Group has a diverse business portfolio and has created
global brands in the telecommunication sector.
Bharti Airtel is one of the topmost companies in the telecom sector in
India and is part of the Bharti Enterprises Group. Bharti Airtel was ranked
as the best performing company in the world by Business Week magazine in
2007. It is India's largest integrated and the first private telecom services
provider, with a footprint in all 23 telecom circles. Since its inception, Bharti
Airtel has been
at the forefront of technology and has steered the course of the telecom sector
in the country with its world class products and services. The businesses at
Bharti Airtel have been structured into three individual strategic business units
(SBU’s) - Mobile Services, Airtel Telemedia Services & Enterprise Services.
The mobile business provides mobile & fixed wireless services using GSM
technology across 23 telecom circles of India and is the largest mobile service
provider in the country, based on the number of customers, while the Airtel
Telemedia Services business offers broadband & telephone services in 94
cities. The Enterprise services focuses on delivering telecommunications
services as an integrated offering including mobile, broadband & telephone,
national and international long distance and data connectivity services to
corporate, small and medium scale enterprises.
The company had around 50 million customers in 2007 and its market
share of mobile subscribers in India stood at 23.4%. Bharti Airtel
Limited's total revenue amounted to Rs.12,242 crore in 2006-2007 and the
net profit stood at Rs.3,126 crore. Bharti's network spans over 364,000 non-
census towns and villages in India. During the period FY05 to FY08, the
company grew its sales and profits at compounded annual rates of 49% and
74% respectively. Bharti Airtel has become a leading company in the telecom
sector in India because it has provided the best quality of service to its
customers, which has been possible because the company has a wide telecom
network built on the latest technology.
Infosys Technologies Ltd. (IT Software)
Infosys Technologies Ltd, headquartered at Bangalore, India, is a leading
consulting & IT service solution provider. Started in 1981, the company is
adept in technology-driven and innovative business transformation initiatives.
The company works with global corporations and new generation technology
companies to deliver end-to-end solutions.
It offers services like software development, maintenance, consulting,
testing and package implementation. Infosys offers all these services through
its highly integrated and globally recognized delivery model. The company
achieved the US$ 3 billion revenue mark in FY07.
With a workforce of around 58,000 employees worldwide, it has offices in
USA, Canada, Australia, China, UAE and European countries besides India.
Infosys is regarded as a pioneer in strategic offshore outsourcing of
software services. Its expertise is offered in Application Development and
H1 : ADF statistic < critical values – reject the null hypothesis, i.e. a unit root does not exist.
Interpretation
The above table shows that the equity series have a unit root problem at level,
but a unit root does not exist in their differences, so they are stationary in their
1st and 2nd differences under the various specifications, i.e. the ADF statistic is
smaller than the critical values with none, intercept, and trend and intercept. The
1st difference and 2nd difference series are nothing but the natural log returns of
the raw values. Natural log returns make the series' mean and variance constant.
Thus the equity series are stationary and the null hypothesis is rejected at the
1%, 5% and 10% levels of significance.
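The transformation referred to above, taking the first difference of log prices to obtain natural log returns, can be sketched as follows (the four prices are illustrative only):

```python
import numpy as np

# First difference of log prices = continuously compounded (log) returns,
# the series on which the stationarity results above are obtained.
prices = np.array([100.0, 102.0, 101.0, 103.5])
log_returns = np.diff(np.log(prices))   # one fewer observation than prices
```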
Bonds
The weighted average prices of the bonds for this test are taken from April 2008 to March 2009. The data taken are raw rates. The unit root result is as under:
Table No 2: Bonds ADF Test
Power Finance Corporation Ltd Indian Railway Finance Corporation
H1 : ADF statistic < critical values – reject the null hypothesis, i.e. a unit root does not exist.
Interpretation
The above table shows that the bond series do not have a unit root problem,
i.e. a unit root does not exist, so they are stationary at their level and in their
1st and 2nd differences under the various specifications, i.e. the ADF statistic is
smaller than the critical values with none, intercept, and trend and intercept. The
1st difference and 2nd difference series are nothing but the natural log returns of
the raw values. Natural log returns make the series' mean and variance constant.
Thus the bond series are stationary and the null hypothesis is rejected at the
1%, 5% and 10% levels of significance.
Currencies
The exchange rates of the currencies for this test are taken from April 2008 to March 2009. The data taken are raw rates. The unit root result is as under: