Thursday, December 21, 2006

Mainframe to client-server, more


I’ve noticed a marked increase in the number of calls regarding mainframe contracts in the second half of 2006. Although many of them look for Vantage 1 and DB2, in general the requirements have become less specific than they were even a year ago.

Yet in the late 1990s everyone said that anyone who didn’t switch to “client-server” (or to "open systems") would become unemployable once Y2K was over. Remember that Y2K generated tremendous demand for routine mainframe programming (including ALC) in the late 1990s, and that demand did drop off precipitously right after 2000 started. There have been other blips, though: MMIS, HIPAA, the Euro conversion, a variety of other government and social-welfare programs, and even some special projects for the IRS. Many code maintenance jobs went overseas, especially to India, and sometimes this included night support (taking advantage of the time-zone difference on the other side of the globe). There seems to be some evidence that some of this work is coming back home. Offshore outsourcing does not always save all the money that companies expect, and sometimes valuable expertise is lost.

In the 1990s it became fashionable to say that older professionals couldn’t learn the new stuff, a claim that has been controversial ever since. The real techies started out by running their own webservers at home in the middle 1990s (one friend of mine did this on a 386 machine -- he would scold me for my "astonishing lack of curiosity" when I did not play around with downloading various software just to experiment, which was what you had to do then to learn this stuff -- and another developed a web hosting business; I was his steady customer for four years). Yet all this developed just after news commentators started talking about the Web on CNN, and while AOL and Prodigy still lived off of their proprietary content. (How that has changed! Prodigy was a bit clownish in those days.)

Those were the days, my friend (as in the 1968 song). 2400 baud was an acceptable way to get email at home; even at work, until maybe 1991 or so, 9600 baud was a bit of a luxury if you had a remote mainframe. The rapid improvement in connectivity options no doubt helped speed up the corporate mergers of the 90s.

By the late 90s, companies like the insurance company I worked for had developed a paradigm of legacy systems and cycles on the mainframe, fancy COBOL/MVS batch replications to a Unix midtier, C-code (procedural, not object) screen emulators ("screen-ems") off CICS, a little DB2 direct connect, TPX, a data access layer in Java with a context factory (the middle tier of what would now be called an n-tier architecture), a C++ bridge, and a GUI, in this case in PowerBuilder. I moved over to supporting this for two years, and found it hard to build employability-guaranteeing expertise in these areas just by responding to user calls or fixing minor bugs. (The tools were a bit backward: the Unix systems had Hummingbird, which was like a slow TSO/ISPF, and code could be edited in vi (with its 24 simultaneous buffers) or Emacs, which many techies were familiar with but which seem clunky compared to ISPF.) Java seems easier to pick up than PowerBuilder. But I took a course in C# with Visual Studio .NET at a technical college near Minneapolis before moving back to DC, and I found C# much more straightforward than any of these.
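For the curious, the "context factory" idea in that data access layer boils down to a factory that hands each caller an object standing in for wherever its data actually lives. Here is a minimal sketch of the pattern in C# (the real layer was in Java, and every name below is invented for illustration):

    using System;

    // A context stands in for one place the data can come from.
    // All names here are made up; this is just the shape of the pattern.
    interface IDataContext
    {
        string Describe();
    }

    class Db2Context : IDataContext
    {
        public string Describe() { return "direct connect to DB2 on the mainframe"; }
    }

    class MidtierContext : IDataContext
    {
        public string Describe() { return "replicated tables on the Unix midtier"; }
    }

    class ContextFactory
    {
        // The factory hides which backend the caller gets, so the GUI
        // and business layers never care where the data physically lives.
        public static IDataContext GetContext(string source)
        {
            if (source == "DB2") return new Db2Context();
            return new MidtierContext();
        }
    }

    class ContextDemo
    {
        static void Main()
        {
            IDataContext ctx = ContextFactory.GetContext("DB2");
            Console.WriteLine(ctx.Describe());   // prints the DB2 description
        }
    }

The point of the pattern is that swapping the midtier for direct connect (or for something else entirely) touches only the factory, not the layers above it.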

The only way to get good at this stuff is to do it, and to spend a couple of years developing a system, going through unit testing, QA with user testing, implementation, and support. Just doing post-implementation support isn’t enough; you have to do the whole thing. So making the "switch" (to "open systems") is a several-year commitment. Now Visual Studio .NET looks like a much more straightforward environment than anything my company had -- but what about platform independence? And instead of screen-ems and replications, XML seems like a much more straightforward technology for moving data around. Health care companies have used it heavily, as it seems to fit into HIPAA compliance more easily.
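To make the XML point concrete, here is a tiny C# example using the System.Xml classes that ship with Visual Studio .NET. The claim record and its field names are made up, not any real HIPAA transaction format; the point is the mechanics:

    using System;
    using System.Xml;

    class XmlDemo
    {
        static void Main()
        {
            // Build a little claim record as XML.  The element names are
            // invented; a real HIPAA transaction would follow a published
            // standard, but the mechanics are the same.
            XmlDocument doc = new XmlDocument();
            doc.LoadXml("<claim><memberId>12345</memberId>" +
                        "<amount>250.00</amount></claim>");

            // The receiving system just walks the tree -- no screen
            // scraping, no fixed-column batch replication files.
            XmlNode member = doc.SelectSingleNode("/claim/memberId");
            XmlNode amount = doc.SelectSingleNode("/claim/amount");
            Console.WriteLine("member " + member.InnerText +
                              " billed " + amount.InnerText);
        }
    }

The data describes itself, so the sender and receiver only have to agree on the element names, not on byte offsets in a flat file.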

Of course, what I want to do with my own “knowledge management”, as discussed on my other blogs, is to get it into a database and have an intelligence engine (something like a super data access layer) connect all the dots. Right now, Visual Studio may be the most straightforward way to do this. You can download the Express editions for free and work with the database and webserver portions separately, but to make something runnable as a whole, it looks like you need Visual Studio Professional, at close to a thousand bucks. But Microsoft keeps changing this (they need your money, though), so I will see.
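As a first step, that "intelligence engine" would be nothing fancier than queries over a database of entries. A rough sketch in C# with the .NET SqlClient classes; the Knowledge database and the Entries table are placeholders for whatever schema the material eventually lands in:

    using System;
    using System.Data.SqlClient;

    class KnowledgeQuery
    {
        static void Main()
        {
            // Database, table, and column names are placeholders for
            // whatever schema the blog entries end up in; "connecting
            // the dots" would sit on top of queries like this one.
            string connStr = "Server=(local);Database=Knowledge;" +
                             "Integrated Security=true";
            using (SqlConnection conn = new SqlConnection(connStr))
            {
                conn.Open();
                SqlCommand cmd = new SqlCommand(
                    "SELECT Title FROM Entries WHERE Topic = @topic", conn);
                cmd.Parameters.AddWithValue("@topic", "outsourcing");
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine(reader.GetString(0));  // entry title
                }
            }
        }
    }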

Picture: operations research and dynamic programming from 1970, RCA Spectra environment.
