What is it?
Invented by JPMorgan in the 1980s, VaR, or Value at Risk, is a way of measuring the amount of money a bank can expect to lose on its portfolio of tradable assets (eg, stocks and bonds) if markets plummet. VaR is calculated for a specified period of time and for a specified level of confidence.
There are various methods of calculating VaR, but one of the most popular is historical simulation. In this case, the bank runs a mathematical model looking at events in the previous three to four years in order to gauge how much money it would lose if the most extreme events over that period were to repeat themselves.
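The historical-simulation approach described above can be sketched in a few lines of Python. This is a minimal illustration, not any bank's actual model: the daily returns are randomly generated stand-ins for three years of market history, and the function name is hypothetical.

```python
import random

def historical_var(returns, portfolio_value, confidence=0.99):
    """One-day VaR by historical simulation: the loss that past
    daily returns exceeded only (1 - confidence) of the time."""
    # Convert each historical return into a hypothetical loss today
    losses = sorted(-r * portfolio_value for r in returns)
    # Pick the loss at the chosen confidence level (99th percentile)
    index = int(confidence * len(losses))
    return losses[min(index, len(losses) - 1)]

# Roughly three years of made-up daily returns, for illustration only
random.seed(42)
returns = [random.gauss(0.0, 0.01) for _ in range(750)]

var_99 = historical_var(returns, portfolio_value=1_000_000)
print(f"99% one-day VaR on a $1m portfolio: ${var_99:,.0f}")
```

In words: sort the losses the portfolio would have suffered on each past day, then read off the one that was only exceeded 1% of the time. Real implementations weight, scale or filter the historical window in various ways, but the core quantile calculation is this simple.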
VaR is calculated for different time periods according to a) how long the bank plans to hold its portfolio, and b) how quickly it can sell the portfolio on. The Basel rules stipulate that banks use VaR measured over a 10-day period to determine how much capital they must hold to cover their trading activities.
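Banks that compute VaR over one day often convert it to the 10-day figure the Basel rules require with the square-root-of-time rule. This sketch assumes independent, identically distributed daily returns, which is exactly the assumption that breaks down in a crisis:

```python
import math

def scale_var(one_day_var, horizon_days=10):
    """Scale one-day VaR to a longer horizon using the
    square-root-of-time rule (assumes i.i.d. daily returns)."""
    return one_day_var * math.sqrt(horizon_days)

# e.g. scaling Lehman's 2001 one-day VaR of $39m to 10 days
ten_day = scale_var(39e6)
print(f"Approximate 10-day VaR: ${ten_day / 1e6:.0f}m")
```

Multiplying by sqrt(10), roughly 3.16, turns a $39m one-day VaR into an approximate 10-day VaR of about $123m.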
What's it got to do with the financial crisis?
With banks suffering $374bn of writedowns (as of Sept 2008) and counting, you might think VaR would have leapt in the months and years prior to the credit crunch. And you'd be right.
For example, VaR at Lehman Brothers rose from $39m (for a one-day holding period) in 2001 to $117m in the fourth quarter of 2007. Similarly, VaR at Morgan Stanley rose from $46m to $146m over the same period. And VaR at UBS went from $47m to $169m.
So what went wrong? Part of the problem seems to be that banks were so busy making money that they didn't take much notice of what was happening to VaR. And with banks' share prices rising during this period, investors didn't seem to care much about the issue, either.
VaR has also been criticised for being too backward-looking: because it's often calculated from historical scenarios, if the historical period is a very benign one, VaR won't be a good indicator that something nasty lurks in the future. Nassim Taleb, professor, author and a former quantitative trader, argued as far back as 1997 that traders using VaR would fall victim to complacency by treating the measure as reliable and concrete rather than as a rough estimate.
Finally, VaR only looks at how bad things are likely to get in 99% of cases. But theories such as the 'black swan' hypothesis suggest that the 1% of cases considered unlikely happen more frequently than expected - and they're the ones which wreak havoc. One example is the market-crushing bankruptcy of Lehman Brothers, which has now sat in the historical data feeding many banks' VaR estimates for a year. Once those lookback windows roll past the troubled post-Lehman markets, banks will be able to take risks without figuring them in, which will be an intriguing test of the financial industry's willingness to police its own risks.
In the case of the credit crunch, VaR didn't prompt banks to take appropriate action to reduce their risk exposure in time to avoid big losses. Nor did it reflect the huge losses that could result if the whole system became dysfunctional. Either banks need a new measure of risk, or they need to take more notice of the one they've got.
Last updated on 7 September 2009.