The spreadsheet did it.

This goes back to the London Whale’s multibillion-dollar botch at JPMorgan last year. Recall that Morgan had just implemented a new Value at Risk model. Using it, the bank underestimated VaR, which helped trigger the debacle.

So what happened? Excel happened:

The new model “operated through a series of Excel spreadsheets, which had to be completed manually, by a process of copying and pasting data from one spreadsheet to another.” The internal Model Review Group identified this problem as well as a few others, but approved the model, while saying that it should be automated and another significant flaw should be fixed. After the London Whale trade blew up, the Model Review Group discovered that the model had not been automated and found several other errors. Most spectacularly,

“After subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . .”
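The arithmetic of that error is easy to reproduce. A minimal sketch (toy numbers, not JPMorgan's actual model): dividing each rate change by the sum of the two rates instead of their average makes every change exactly half as large, which halves the measured volatility and, with it, the VaR.

```python
# Toy illustration of the formula error: dividing by the sum of two rates
# instead of their average halves every computed change, muting volatility.
from statistics import stdev

old_rates = [2.00, 2.10, 1.95, 2.20]
new_rates = [2.10, 1.95, 2.20, 2.05]

# Intended: relative change = (new - old) / average(old, new)
intended = [(n - o) / ((o + n) / 2) for o, n in zip(old_rates, new_rates)]

# As built: divides by the sum -- each change comes out exactly half as big
as_built = [(n - o) / (o + n) for o, n in zip(old_rates, new_rates)]

vol_intended = stdev(intended)
vol_as_built = stdev(as_built)

print(vol_as_built / vol_intended)  # 0.5 -- volatility muted by a factor of two
```

Since VaR scales with volatility, halving volatility halves the reported risk, which is exactly the effect the review described.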

The Baseline Scenario blames Excel’s shortcomings, rightly noting . . .

. . . while Excel the program is reasonably robust, the spreadsheets that people create with Excel are incredibly fragile. There is no way to trace where your data come from, there’s no audit trail (so you can overtype numbers and not know it), and there’s no easy way to test spreadsheets, for starters. The biggest problem is that anyone can create Excel spreadsheets—badly. Because it’s so easy to use, the creation of even important spreadsheets is not restricted to people who understand programming and do it in a methodical, well-documented way.

True as far as it goes. But it also shows an epic enterprise risk management (ERM) failure. If you are going to pin a company’s future on a model, said model should be thoroughly tested. A model that doubles or halves the old model’s calculations should be presumed to be incorrect, then checked, double-checked and triple-checked to make sure that it’s calculating correctly. There should also be some theoretical and heuristic support for why the old model’s calculation was inferior and why the new model is superior.
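The check being described is a parallel run: feed both models the same inputs before go-live and flag any result that moves by anything like a factor of two. A hedged sketch, where the models and the tolerance are illustrative assumptions, not anyone's actual system:

```python
# Sketch of a parallel-run validation gate: run old and new models on the
# same scenarios and flag suspicious ratios before the new model goes live.
# Models and the max_ratio threshold are illustrative assumptions.

def parallel_run_check(old_model, new_model, scenarios, max_ratio=1.5):
    """Return (scenario, ratio) pairs where new/old falls outside tolerance."""
    flagged = []
    for s in scenarios:
        ratio = new_model(s) / old_model(s)
        if not (1 / max_ratio <= ratio <= max_ratio):
            flagged.append((s, ratio))
    return flagged

# Toy stand-ins: the "new" model accidentally halves everything.
def old_model(s):
    return 100.0 * s

def buggy_new_model(s):
    return 50.0 * s

flagged = parallel_run_check(old_model, buggy_new_model, scenarios=[0.5, 1.0, 2.0])
print(flagged)  # every scenario flagged at ratio 0.5 -- presume the model wrong
```

A gate like this would have stopped the new VaR model at the first scenario: a uniform 50% drop is precisely the "presumed incorrect until proven otherwise" case.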

Further, a model that critical to a company that large just has to be more robust. It should use code to extract and transfer data from sheet to sheet. It should be engineered to minimize manual inputs. All of those inputs should take place on one screen. These things should be in place before you change models. You shouldn’t be playing catch-up after the upgrade.
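The copy-and-paste step the Model Review Group flagged is exactly what a small extraction script eliminates. A sketch using a CSV export in place of a spreadsheet tab (file layout, column names, and the change formula are assumptions for illustration):

```python
# Sketch of replacing manual copy-and-paste between sheets with code:
# read the source data once, compute derived values in one audited place.
# Column layout and the sample data are assumptions for illustration.
import csv
import io

# Stand-in for an exported "rates" sheet.
rates_csv = io.StringIO("tenor,old_rate,new_rate\n1y,2.00,2.10\n5y,2.10,1.95\n")

rows = list(csv.DictReader(rates_csv))

# The derived sheet is built by code -- every value traceable to its source
# row, with the (new - old) / average(old, new) change defined in one place.
derived = []
for r in rows:
    o, n = float(r["old_rate"]), float(r["new_rate"])
    derived.append({"tenor": r["tenor"], "pct_change": (n - o) / ((o + n) / 2)})

print(derived)
```

The point is not the particular library: once the transfer is code, it can be tested, versioned, and audited, none of which is true of a human pasting ranges between workbooks.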

And responsibility for all that should go to the highest level at the company, here CEO Jamie Dimon.

I’ve seen companies one-sixteenth the size of JPMorgan do all of these things with Excel. Hard to think the big bank was so rigorous when a formula using =sum() instead of =average() halves a result and nobody notices.

All of which helps one better understand what’s behind the Agnes Rule: If banks sold anything but money, they’d go broke.


4 thoughts on “The spreadsheet did it.”

  1. Alex says:

    “A model that doubles or halves the old model’s calculations should be presumed to be incorrect, then checked, double-checked and triple-checked to make sure that it’s calculating correctly. There should also be some theoretical and heuristic support for why the old model’s calculation was inferior and why the new model is superior.”

    Unless the direction of the change helps management maximize current extractable value…
    =============================================
    “And responsibility for all that should go to the highest level at the company, here CEO Jamie Dimon.”

    Who is undoubtedly quaking in his boots about the SOX violations he’ll be charged with any minute now.

    • Alex says:

      Wait, it gets better! From page 8 of the Levin report:

      “In the case of the CIO VaR, after analysts concluded the existing model was too conservative and overstated risk, an alternative CIO model was hurriedly adopted in late January 2012, while the CIO was in breach of its own and the bankwide VaR limit. The bank did not obtain OCC approval as it should have. The CIO’s new model immediately lowered the SCP’s VaR by 50%, enabling the CIO not only to end its breach, but to engage in substantially more risky derivatives trading. Months later, the bank determined that the model was improperly implemented, requiring error-prone manual data entry and incorporating formula and calculation errors. On May 10, the bank backtracked, revoking the new VaR model due to its inaccuracy in portraying risk, and reinstating the prior model.”

      So, JPM “hurriedly adopted” the new VaR model while already “in breach of … the bankwide VaR limit.” Will Excel be charged as an accomplice?

  2. Ralph Baxter says:

    What are those parables about throwing stones in glass houses, or removing the beam from one’s own eye? We have analysed many actuarial environments where risk calculations are based on hundreds or thousands of interlinked Excel workbooks. Chief Actuaries tend to respond, ‘That’s fantastic – the first time I can show people how it all works.’ CEOs tend to respond, ‘OMG – that’s how it all works?’

    See http://www.clusterseven.com/spreadsheet-links-map if you want an example.

  3. […] Actually, that’s not fair. Morgan management didn’t ignore the alarms. They just built a new clock – here, a new VaR model. Rather famously, the new model, built in haste, sucked. […]
