Thursday, March 12, 2009

We tossed away our mature "mainframe" talent and now we have a financial crisis; does the government really have a "system" to monitor the "toxins"?


In reading repeated media stories about the financial crisis, I’ve noticed that the government seems to have no straightforward way to calculate the risk that it has taken on.

It is common in public policy organizations (like the Fed and the Treasury, even though they do different things) to run simulation models to predict financial results. A company I worked for in 1989 did this with Medicare operating margins. We had an analyst who modeled all the equations (based on ProPAC and the Federal Register rules) and a contract programmer who coded those equations into a series of COBOL and SAS jobs driven by public policy parameters, various medical inputs like case-mix, and claims data.

In a financial simulation, the same overall approach should apply. The government would need to run a model that can walk a database representing all the “assets” (that is, the toxic assets) held by a financial institution and calculate valuations.
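To make that concrete, here is a minimal sketch of the kind of common-format record a converted “toxic asset” might land in, written as a COBOL copybook the way we would have done it then. Every field name and size here is my own illustrative assumption, not any actual government or bank layout:

      * TOXASSET copybook (hypothetical): one underlying mortgage
      * position as it appears inside a structured security.
       01  TOXIC-ASSET-RECORD.
           05  TA-INSTITUTION-ID      PIC X(10).
           05  TA-SECURITY-ID         PIC X(12).
           05  TA-TRANCHE-LEVEL       PIC 9(2).
           05  TA-MORTGAGE-ID         PIC X(15).
           05  TA-ORIGINAL-BALANCE    PIC 9(9)V99.
           05  TA-CURRENT-BALANCE     PIC 9(9)V99.
           05  TA-DELINQUENCY-STATUS  PIC X(2).
           05  TA-ESTIMATED-VALUE     PIC 9(9)V99.

Each program in the suite would COPY a layout like this, and the load step would populate it from whatever each institution actually sends in.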

The Time Magazine issue from March 9, 2009 (p. 30, “One Bad Bond”) provides some diagrams that help illustrate the point. Ordinary mortgage bonds made up of performing mortgages don’t pose much of a problem, but what Wall Street did was take the riskiest or “subprime” mortgages and package them into the lower “tranches” of CDOs, in such a way that losses are multiplied.
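To see why the tranche structure matters, here is a tiny COBOL sketch of a loss “waterfall”: a pool loss eats the junior tranche first, then the mezzanine, then the senior. The tranche names, sizes, and the loss figure are numbers I made up purely for illustration; the point is only the order in which losses are absorbed.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. WATERFALL.
      * Sketch of a loss "waterfall": a hypothetical pool loss is
      * absorbed by the junior tranche first, then the mezzanine,
      * then the senior.  All figures are made-up illustrations.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  POOL-LOSS            PIC 9(9)V99  VALUE 35000000.00.
       01  REMAINING-LOSS       PIC 9(9)V99.
       01  LOSS-TAKEN           PIC 9(9)V99.
       01  TRANCHE-TABLE.
           05  TRANCHE OCCURS 3 TIMES INDEXED BY TX.
               10  TR-NAME      PIC X(10).
               10  TR-SIZE      PIC 9(9)V99.
       01  EDITED-AMT           PIC Z,ZZZ,ZZZ,ZZ9.99.
       PROCEDURE DIVISION.
       MAIN-PARA.
      *    Made-up tranche sizes, junior (riskiest) first.
           MOVE "JUNIOR"     TO TR-NAME (1)
           MOVE 20000000.00  TO TR-SIZE (1)
           MOVE "MEZZANINE"  TO TR-NAME (2)
           MOVE 30000000.00  TO TR-SIZE (2)
           MOVE "SENIOR"     TO TR-NAME (3)
           MOVE 150000000.00 TO TR-SIZE (3)
      *    Push the pool loss down through the tranches in order.
           MOVE POOL-LOSS TO REMAINING-LOSS
           PERFORM VARYING TX FROM 1 BY 1 UNTIL TX > 3
               IF REMAINING-LOSS > TR-SIZE (TX)
                   MOVE TR-SIZE (TX) TO LOSS-TAKEN
               ELSE
                   MOVE REMAINING-LOSS TO LOSS-TAKEN
               END-IF
               SUBTRACT LOSS-TAKEN FROM REMAINING-LOSS
               MOVE LOSS-TAKEN TO EDITED-AMT
               DISPLAY TR-NAME (TX) " ABSORBS " EDITED-AMT
           END-PERFORM
           STOP RUN.

With the figures above, the junior tranche is wiped out, the mezzanine takes a partial hit, and the senior is untouched, which is exactly why the lower tranches of a subprime CDO are so dangerous.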

Each of these securities, however, has a structure that can be represented in a database record on a schema. The structure is hierarchical (as in IMS) or perhaps more networked (as in IDMS) rather than simply relational. This sort of data lends itself to analysis by traditional batch mainframe (usually COBOL) programming, highly “structured”, with routines that walk through all of the entities in the database (in IDMS they would be SETs linking back to all the original mortgage contracts wherever they appear in the tranches of the CDOs). The jobs have to be set up to run automatically on schedules. The data would have to be converted from the various formats the individual financial institutions use and loaded into a common format on the database. All of this activity is well-known information technology that was “popular” back in the 1980s. It’s not the new sexy world of Web 2.0 or e-commerce. It’s the “boring”, batch, stable, highly secure world of IBM MVS (or OS/390) and S0C7’s.
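As a rough sketch of the kind of batch pass I am describing, assume the converted data has been unloaded to a sequential extract sorted by security id (the FD below is a pared-down version of the record layout sketched earlier). The program does a classic control break and rolls up an estimated value per security; the file name and the 30-cents-on-the-dollar markdown for delinquent loans are assumptions of mine, just to show the shape of the job.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. VALWALK.
      * Sketch of a sequential batch valuation pass: read an extract
      * of asset records already converted to a common layout, roll
      * up an estimated value per security (classic control break),
      * and display one total line per security.  The file name, the
      * layout and the markdown rule are illustrative assumptions.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT ASSET-FILE ASSIGN TO "ASSETS.DAT"
               ORGANIZATION IS LINE SEQUENTIAL.
       DATA DIVISION.
       FILE SECTION.
       FD  ASSET-FILE.
       01  ASSET-REC.
           05  AR-SECURITY-ID       PIC X(12).
           05  AR-CURRENT-BALANCE   PIC 9(9)V99.
           05  AR-DELINQUENT-FLAG   PIC X.
       WORKING-STORAGE SECTION.
       01  WS-EOF                   PIC X     VALUE "N".
       01  PREV-SECURITY-ID         PIC X(12) VALUE SPACES.
       01  SECURITY-VALUE           PIC 9(11)V99 VALUE ZERO.
       01  EDITED-VALUE             PIC Z(10)9.99.
       PROCEDURE DIVISION.
       MAIN-PARA.
           OPEN INPUT ASSET-FILE
           PERFORM UNTIL WS-EOF = "Y"
               READ ASSET-FILE
                   AT END MOVE "Y" TO WS-EOF
                   NOT AT END PERFORM PROCESS-RECORD
               END-READ
           END-PERFORM
           IF PREV-SECURITY-ID NOT = SPACES
               PERFORM PRINT-SECURITY-TOTAL
           END-IF
           CLOSE ASSET-FILE
           STOP RUN.

       PROCESS-RECORD.
      *    Control break: the extract is assumed sorted by security.
           IF AR-SECURITY-ID NOT = PREV-SECURITY-ID
               AND PREV-SECURITY-ID NOT = SPACES
               PERFORM PRINT-SECURITY-TOTAL
           END-IF
           MOVE AR-SECURITY-ID TO PREV-SECURITY-ID
      *    Crude valuation rule (pure assumption): delinquent loans
      *    are marked down to 30 cents on the dollar.
           IF AR-DELINQUENT-FLAG = "Y"
               COMPUTE SECURITY-VALUE =
                   SECURITY-VALUE + (AR-CURRENT-BALANCE * 0.30)
           ELSE
               ADD AR-CURRENT-BALANCE TO SECURITY-VALUE
           END-IF.

       PRINT-SECURITY-TOTAL.
           MOVE SECURITY-VALUE TO EDITED-VALUE
           DISPLAY PREV-SECURITY-ID " ESTIMATED VALUE " EDITED-VALUE
           MOVE ZERO TO SECURITY-VALUE.

In a real shop this would be one step in a scheduled job stream, fed by the conversion and load steps and followed by reporting, with the IDMS or IMS database rather than a flat extract as the system of record.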

True, some of the banks might keep their data in new-age formats, as instances of classes in object-oriented environments. But it is possible to design a system like this with object-oriented techniques even in COBOL.

But from everything I hear, it doesn’t sound as though the government has its arms around any of this. It takes something like a year for IT professionals to develop the information requirements, define the processing and the database, code, test, QA, and validate the results. I did this for thirty years. I wonder how in the world the Treasury Department and the Fed are going to get this done. They would have to make up their minds to do it with technology that is somewhat forgotten, and with many of the older professionals used to this kind of work already retired.

After Y2K, most employment in this sort of work tended to be contract-based, tied to specific programs or legal changes like MMIS or HIPAA. Perhaps the same will be true of this. But we’ve cast aside a lot of the old-fashioned, bread-and-butter IT professionalism of the pre-Y2K era, and suddenly we have the largest financial collapse since the Great Depression. Is there a relationship?

I’m 65, and I think I can help “you” solve this problem. I’ll network, but let me know what “you” think.
