Tuesday, June 21, 2011

What does the financial crisis have to do with Three Mile Island?

A lot, if one believes the Financial Times columnist and book author Tim Harford. For his book Adapt, he researched the parallels between safety in engineering and stability in financial markets. Harford singles out three main issues that endanger the stability of financial markets: the complex structures of financial institutions, the interconnectedness of those institutions, and a lack of oversight by regulators.

The more complex a financial institution, the more likely it is that risks will not be recognized until it is too late. Harford compares the meltdown of Lehman Brothers with the failing reactor at Three Mile Island in the United States. The reactor, he says, started overheating at 4 a.m., when knowledgeable personnel were absent; the control room was hard to read because of its impractical design; and the safety systems reinforced the catastrophe rather than attenuating it.

Controlled explosion of a bank (photo: total_incompletion, CC BY-NC)
In the financial meltdown of Lehman Brothers, Harford sees the same process. PricewaterhouseCoopers was asked to organize the orderly insolvency of Lehman Brothers Europe, but when its consultants arrived, they could not make sense of the complexity of the institution, with its many divisions, assets, and real estate holdings. They ended up following Lehman Brothers employees around to understand what their jobs actually were.

Likewise, politicians were not given adequate information to handle the financial crisis responsibly. Harford cites the example of Tim Geithner, then president of the New York Fed, who received the news of AIG's imminent breakdown at 4 a.m., after a transatlantic flight, on a handwritten sheet of A4 paper covered with figures.

Finally, the security systems themselves reinforced the financial meltdown. According to Harford, even small banks that didn't engage in risky speculation often took out insurance with a re-insurer like AIG. The goal: if for whatever reason customers withdrew more money than the bank held in cash, the shortfall would be covered by the re-insurer. Since most re-insurers were rated AAA, the small bank inherited an AAA rating as well (it was now supposedly certain that the bank could service its debt). In practice, however, re-insurer A also held a re-insurance contract with re-insurer B, who had a contract with re-insurer C, who had a contract with re-insurer - A! The moment A's panicking clients withdrew their money, A's collapse pulled not only B and C into the abyss but also the small commercial bank. Its portfolio was suddenly uninsured and its rating fell to C; it could no longer borrow from other financial actors except at skyrocketing interest rates.
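The circular chain of guarantees can be made concrete with a toy model (my illustration, not Harford's): each institution stays solvent only as long as its insurer does, so a single failure propagates around the loop. The institution names and the guarantee map below are hypothetical.

```python
# Hypothetical guarantee graph: insured institution -> its (re-)insurer.
# The cycle A -> B -> C -> A mirrors the circular contracts in the text.
guarantees = {
    "SmallBank": "A",  # small commercial bank insured by re-insurer A
    "A": "B",          # re-insurer A is itself re-insured by B
    "B": "C",          # B is re-insured by C
    "C": "A",          # C is re-insured by A -- the circle closes
}

def cascade(first_failure, graph):
    """Return the set of institutions dragged down once `first_failure` collapses."""
    down = {first_failure}
    changed = True
    while changed:
        changed = False
        for insured, insurer in graph.items():
            # an institution whose insurer is down loses its cover and fails too
            if insurer in down and insured not in down:
                down.add(insured)
                changed = True
    return down

print(sorted(cascade("A", guarantees)))
```

Because the guarantees form a cycle, the failure of A alone takes down every institution in the model, including the prudent small bank that never speculated.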

The example shows two things. First, a latent error was lurking in the system: the insurance would fail at exactly the moment when everybody sold stocks and rushed to the banks to withdraw their money. Second, the security system endangered the very players it was supposed to protect. Worse, it gave insured banks an incentive to take greater risks, certain that they would be covered if their speculation backfired.

Harford draws four conclusions from his findings: it is crucial to
  • understand the structure of a financial institution in order to reduce risks and latent errors,
  • not only understand but reduce the complexity of these institutions,
  • decouple regular banking activities from risky activities, so that the former cannot be endangered by a collapse of the speculative sector, and
  • encourage whistleblowers within the institution to uncover risks and communicate them.
I encourage you to listen to the entire talk Tim Harford recently gave at the LSE. You can find it here. 
