Friday, April 03, 2009
Could I.T. have helped forestall the financial crisis?
Robert O’Harrow, Jr. and Jeff Gerth have a story about the history of information systems on Wall Street that intersects with today’s financial crisis in an ironic way.
As recently as 2005, credit derivatives were still traded by paper, pen, and fax. The mechanical clumsiness of the trading system was itself seen as a financial peril, because deals could not be closed quickly enough. The irony is that Geithner’s New York Fed did not grasp the new risks that would accumulate once the mechanics of the system became more efficient. The story, on the front page of today’s (April 3) Washington Post, is here.
Of course, what was missing was the ability to “valuate” the products in some systematic processing cycle, of the kind well known to programmers at all major financial institutions, and to give management the ability to assess risk. Was this a failure at some human psychological level, or just a failure in business systems analysis to develop the aggregate reporting tools needed to see what was going on?
I remember, from the failed “Combined Medicare Project” that I worked on from 1979 to 1981, that a great deal of emphasis was to be placed on post-payment utilization review, with complete statistical reporting on usage trends. In financial services companies, complex valuations of products are always run at the end of each month’s cycle. Does this need to be done by the Fed and Treasury, much higher up the food chain?
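To make the idea of a month-end valuation cycle concrete, here is a minimal sketch of what such a batch might do: value each position and roll exposures up by counterparty so management could see aggregate risk. Everything here is hypothetical and illustrative — the function names, the crude expected-loss formula, and the sample figures are my own assumptions, not drawn from any real trading or reporting system.

```python
# Hypothetical sketch of a month-end batch valuation cycle.
# Each position is valued with a crude expected-loss formula,
# then losses are aggregated by counterparty for a management report.
# All names and numbers are illustrative assumptions.

from collections import defaultdict

def value_position(notional, recovery_rate, default_prob):
    """Crude expected-loss valuation: notional x default probability
    x loss-given-default. A real desk would use far richer models."""
    return notional * default_prob * (1.0 - recovery_rate)

def month_end_report(positions):
    """Aggregate expected loss by counterparty across all positions."""
    exposure = defaultdict(float)
    for p in positions:
        loss = value_position(p["notional"], p["recovery_rate"],
                              p["default_prob"])
        exposure[p["counterparty"]] += loss
    return dict(exposure)

# Illustrative sample book (entirely made-up figures).
positions = [
    {"counterparty": "Bank A", "notional": 10_000_000,
     "recovery_rate": 0.4, "default_prob": 0.02},
    {"counterparty": "Bank A", "notional": 5_000_000,
     "recovery_rate": 0.4, "default_prob": 0.05},
    {"counterparty": "Bank B", "notional": 8_000_000,
     "recovery_rate": 0.3, "default_prob": 0.01},
]

report = month_end_report(positions)
```

The point of the sketch is the aggregation step: even trivially simple per-position valuations become useful to management only once they are rolled up into a systematic report — exactly the kind of reporting tool whose absence the post above is asking about.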