Thursday, May 24, 2012

Managing Risks Means Managing Arguments - Justin Fox - Harvard ...

So it was Lyme disease that did it! The tick-borne illness kept JPMorgan Chase's Ina Drew out of the office for extended periods in 2010 and 2011. And it was during Drew's absences, according to a richly detailed account in The New York Times, that the bank's chief investment office, which she ran, began to get into trouble:

The morning conference calls Ms. Drew had presided over devolved into shouting matches between her deputies in New York and London, the traders said. That discord in 2010 and 2011 contributed to the chief investment office's losing trades in 2012, the current and former bankers said.

Whether this really was the main reason for JP Morgan's $3 billion (and growing) trading loss or not, it does at least sound like it could be true. Managing risks, especially the hard-to-pin-down, moving-target risks that any financial trading operation has to cope with, inevitably involves arguing. Which is why managing those arguments, as Ina Drew appears to have done brilliantly during the financial crisis but wasn't around to do for the past couple of years, is so important.

The words "risk management" usually evokes less subjective, more data-driven pursuits. But data and objectivity can only get you so far. Philosopher Karl Popper famously proposed that to be scientific, a theory had to be falsifiable: that is, it had to make predictions that could be tested and possibly shown to be wrong. Popper spent a lot of time thinking about this definition of science and the burgeoning science of probablility, which he called propensity. (This summary is from Wittgenstein's Poker, a book I've been reading):

As far as falsification is concerned, he thought that statements involving stable propensities, such as 'The die has a one in six chance of landing on six', could be tested by looking at what happens in the long run. But isolated statements of propensity, such as 'There is a propensity of 1/100 that there will be a nuclear holocaust before the year 2050', may resist testing, and to that extent exclude themselves from science.
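
To make Popper's distinction concrete: a stable propensity like a fair die's one-in-six chance of showing six can be checked against long-run frequencies, which settle toward the stated probability as trials accumulate, while a one-off claim offers no run of trials to check at all. A minimal Python sketch of that long-run test (my own illustration, not from the book or the article):

    import random

    def observed_frequency(trials):
        # Roll a fair six-sided die `trials` times and count how often a six comes up.
        sixes = sum(1 for _ in range(trials) if random.randint(1, 6) == 6)
        return sixes / trials

    for n in (60, 600, 6000, 60000):
        # The observed frequency drifts toward the stated propensity of 1/6 (about 0.167).
        print(n, round(observed_frequency(n), 3))

    # By contrast, a claim like 'a 1/100 propensity of nuclear holocaust before 2050'
    # gives us no repeated trials, so there is no long-run frequency to test it against.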

Routine risks like worker safety and even some day-to-day trading hazards can thus be managed successfully with a mechanistic, scientific approach. But the kind of big-picture bets that JP Morgan's chief investment office made could never be tested, or managed, in that way. Decisions either worked out or they didn't; given the small sample size it was impossible to test what the true probabilities were.
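The sample-size problem can be put in rough numbers: a handful of yes/no outcomes pins down almost nothing about the underlying probability of success, and the plausible range only narrows once there are hundreds of comparable trials, which big one-off bets never supply. A back-of-the-envelope sketch (my own illustration, using a standard Wilson score interval, not anything from the article):

    import math

    def wilson_interval(successes, n, z=1.96):
        # 95% Wilson score confidence interval for a binomial proportion.
        p = successes / n
        denom = 1 + z**2 / n
        center = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return center - half, center + half

    # Four of five big bets paid off: the plausible success rate still spans roughly 38% to 96%.
    print(wilson_interval(4, 5))

    # 400 of 500 routine trades paid off: the range narrows to roughly 76% to 83%.
    print(wilson_interval(400, 500))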

To navigate such unquantifiable hazards, then, you need to make judgment calls. And that's where argument (or discussion, or conversation, if you prefer) comes in. You want diverse, even opposing viewpoints. You want to manage their interactions in a way that allows the quieter, less-senior, less-predictable voices to be heard. You probably do want to accord different weights to the arguments of different people, although deciding how to do so (past track record? clarity of argument?) is hard.
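The weighting question can at least be stated mechanically, even if answering it is hard. One possible scheme, purely my own illustration rather than anything the article prescribes, is to pool each person's probability estimate weighted by past track record instead of by volume or seniority:

    def pooled_estimate(forecasts):
        # `forecasts` is a list of (probability_estimate, track_record_weight) pairs.
        # Each voice counts in proportion to its past accuracy, not its loudness.
        total_weight = sum(weight for _, weight in forecasts)
        return sum(p * weight for p, weight in forecasts) / total_weight

    # A loud senior voice at 0.05, plus two quieter analysts with better records at 0.30 and 0.40:
    print(pooled_estimate([(0.05, 1.0), (0.30, 2.5), (0.40, 2.0)]))  # about 0.29, not 0.05

Deciding what those weights should be is, of course, exactly the judgment call described above.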

In any case, it should be clear that you don't want to just let the loudest voices win. When that happens, everybody loses. I once heard David Modest, a former partner in the infamous hedge fund Long-Term Capital Management, attribute the unraveling of LTCM to exactly this: a few very certain, very loud partners had been able to override the more tentatively stated doubts of the rest to steer the firm into oblivion. This fits with Philip Tetlock's division of forecasters into hedgehogs and foxes: those who are more certain (the hedgehogs) are also more likely to be wrong.

This is one reason why the standard risk-management setup at Wall Street firms proved such a bust in the financial crisis. Risk-management chiefs at most banks were clearly subordinate to those who brought in the big bucks, and thus incapable of winning arguments when it really mattered; it was only at places like Goldman Sachs and JPMorgan, where the CEO saw himself as risk-manager-in-chief, that the process seemed to work at all.

The point applies beyond the financial sector. Successfully managing most of the biggest risks that businesses and societies face requires successfully managing arguments about what exactly those risks are and how seriously they should be taken. It may be that we'll eventually be able to systematize those arguments in a useful way, as computer scientist Peter McBurney and colleagues have been trying to do for the past few years. But in the meantime it's much more art than science.

