This goes back to the London Whale’s multibillion-dollar botch at JPMorgan last year. Recall that Morgan had just implemented a new Value at Risk model. Using it, the bank underestimated VaR, which helped trigger the debacle.
So what happened? Excel happened:
The new model “operated through a series of Excel spreadsheets, which had to be completed manually, by a process of copying and pasting data from one spreadsheet to another.” The internal Model Review Group identified this problem as well as a few others, but approved the model, while saying that it should be automated and another significant flaw should be fixed. After the London Whale trade blew up, the Model Review Group discovered that the model had not been automated and found several other errors. Most spectacularly,
“After subtracting the old rate from the new rate, the spreadsheet divided by their sum instead of their average, as the modeler had intended. This error likely had the effect of muting volatility by a factor of two and of lowering the VaR . . .”
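The arithmetic behind that quote is worth seeing directly. A toy sketch (invented numbers, not the bank's actual inputs) of why dividing a rate change by the sum of two rates, rather than their average, halves the result: the sum is exactly twice the average.

```python
# Toy numbers for illustration only -- not the bank's actual model or inputs.
old_rate, new_rate = 0.030, 0.034
change = new_rate - old_rate

intended = change / ((old_rate + new_rate) / 2)  # divide by the average, as intended
buggy = change / (old_rate + new_rate)           # divide by the sum, as the model did

print(intended, buggy, intended / buggy)  # the buggy value is exactly half
```

Any downstream quantity built on that ratio, volatility included, comes out muted by the same factor of two.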
The Baseline Scenario blames Excel’s shortcomings, rightly noting . . .
. . . while Excel the program is reasonably robust, the spreadsheets that people create with Excel are incredibly fragile. There is no way to trace where your data come from, there’s no audit trail (so you can overtype numbers and not know it), and there’s no easy way to test spreadsheets, for starters. The biggest problem is that anyone can create Excel spreadsheets—badly. Because it’s so easy to use, the creation of even important spreadsheets is not restricted to people who understand programming and do it in a methodical, well-documented way.
True as far as it goes. But it also shows an epic failure of enterprise risk management (ERM). If you are going to pin a company's future on a model, that model should be thoroughly tested. A model that doubles or halves the old model's calculations should be presumed incorrect, then checked, double-checked and triple-checked to make sure it's calculating correctly. There should also be theoretical and heuristic support for why the old model's calculation was inferior and why the new model is superior.
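The check described above is a standard parallel-run (back-to-back) test, and it is cheap to sketch. This is a minimal illustration under the assumption that both models can be run on the same inputs; the function and parameter names are invented for the example.

```python
# Hypothetical parallel-run check: run old and new models side by side and
# flag any case where the outputs diverge by more than a tolerance ratio.
def parallel_run_check(old_model, new_model, cases, max_ratio=1.25):
    flagged = []
    for case in cases:
        old, new = old_model(case), new_model(case)
        ratio = new / old
        if not (1 / max_ratio <= ratio <= max_ratio):
            flagged.append((case, old, new, ratio))
    return flagged

# A new model that halves the old result (like the divide-by-sum bug)
# is flagged on every single case:
flags = parallel_run_check(lambda x: x * 0.10, lambda x: x * 0.05, [100, 200])
print(flags)
```

A new model that silently halves every output should never survive even this crude a comparison against its predecessor.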
Further, a model that critical to a company that large simply has to be more robust. It should use code to extract and transfer data from sheet to sheet. It should be engineered to minimize manual inputs, and all of those inputs should take place on one screen. These safeguards should be in place before you change models, not promised as a catch-up afterward.
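"Use code to transfer data" can be as simple as a few lines of script. A hedged sketch of the idea, using CSV files as stand-ins for Excel tabs (the sheet and column names here are invented for illustration):

```python
# Illustrative only: move numbers between "sheets" with code, not copy-paste.
# The transform lives in one reviewable place and no value is ever retyped.
import csv
import io

# Stand-in for a source worksheet (invented data).
source_sheet = io.StringIO(
    "desk,old_rate,new_rate\n"
    "fx,0.030,0.034\n"
    "rates,0.020,0.025\n"
)
rows = list(csv.DictReader(source_sheet))

# Stand-in for the destination worksheet, populated by script.
output_sheet = io.StringIO()
writer = csv.DictWriter(output_sheet, fieldnames=["desk", "rate_change"])
writer.writeheader()
for row in rows:
    writer.writerow({
        "desk": row["desk"],
        "rate_change": round(float(row["new_rate"]) - float(row["old_rate"]), 6),
    })

print(output_sheet.getvalue())
```

The point is not the tooling; it is that every number reaches the destination sheet through one audited path instead of a human's clipboard.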
And responsibility for all that should go to the highest level at the company, here CEO Jamie Dimon.
I’ve seen companies a tiny fraction of JPMorgan’s size do all of these things with Excel. Hard to believe the big bank was rigorous when a formula that uses =sum() instead of =average() halves a result and nobody notices.
All of which helps one better understand what’s behind the Agnes Rule: If banks sold anything but money, they’d go broke.