Monday, March 10, 2008

Recap: did our "culture" of computing really change radically over the decades? Yes, because end-users demanded it.


So how did the nature of the IT job market change over the years? I think someone – me – could write a book about it. Let’s say it all came in overlapping stages.

Remember the days of coding sheets and punched cards? Remember those run decks of “compile, load and go” back in the 60s?

IBM announced its 360 operating systems in 1964, and had demonstrated them fairly widely by 1966. By the late 60s and early 70s, companies were starting to develop their bread-and-butter batch processing and “time sharing” systems, often written largely in assembler under DOS, with COBOL (and sometimes FORTRAN) gradually replacing the assembler, and with MVS coming to the fore in the 70s. “Data processing” was a mystery then. Companies like EDS would hire and train “systems engineers” and instill an almost military culture and dress code in order to keep the confidence of customers. (In fact, H. Ross Perot started out by hiring mostly military officers: “Eagles don’t flock, you have to hire them one at a time.” I remember seeing Exchange Park when I lived in Dallas, but by then EDS had long since moved to Forest Lane.) IBM was also notorious for its dress code, even checking for garters.

In the 70s there were other companies competing with IBM for the mainframe market, including Univac and Burroughs. Univac actually had a more programmer-friendly environment, with a simpler job control language (Exec 8, which felt much like today’s Unix shells) and terminal access for programmers. But IBM dominated the job market and gradually drove out the competition. By the mid to late 70s, most major companies employed applications programmers to write, maintain and support their applications systems, which then relied heavily on overnight and end-of-month batch cycles, requiring on-call support when jobs abended (the “dreaded” S0C7 – or S0C4).

In the 80s, job submission and source management became much more automated and sophisticated, and by the end of the 80s large shops tended to have reasonable security, routinely denying programmers access to production files. (There were always plenty of accidental loopholes, like the IDMS Central Version.) By the late 80s, most employers expected job candidates to have online experience with the main teleprocessing monitor, CICS, first in Macro Level and then Command Level mode. Most also expected experience with a database, usually IMS (hierarchical), IDMS (network), Datacom (inverted list) or Adabas, all gradually being replaced by DB2. Mainframe legacy applications, often originally written in-house, were being replaced by vendor-supplied software (like Vantage for life insurance), and jobs tended to require experience with these packages.

But PCs had been introduced to home users in the early 80s, and businesses were gradually integrating them into their workflow. A number of desktop applications, including databases (like dBase III+ and dBase IV), let users imitate on a small scale what corporate systems did. Inside companies, end users wanted more control over their computing, and portions of databases would be offloaded onto desktops with simple DOS applications, sometimes written in languages like Micro Focus COBOL. Offices had local area networks for documentation. But PCs were still used mainly to emulate 3270 terminals to get into mainframes, and users at home could dial in with products like ProComm and later pcAnywhere. By the mid-90s, companies were establishing direct connectivity to databases (especially DB2), or replicating data to mid-tiers and using Unix servers, various databases and object-oriented languages to present GUIs to users. The culture of online programming with character-based screens would gradually be replaced by very sophisticated end-user applications in object-oriented languages, especially Java, which became a production standard in an amazingly short time, and end-user applications took on a striking amount of “artificial intelligence.”

Mainframe jobs exploded before Y2K as companies had to retrofit old applications for the Y2K midnight event (an “event” that really stretched out for some time), but after the 2000-2001 recession mainframe jobs tended to be replaced by contracting gigs that required very specific experience. Many older programmers had difficulty keeping up with the incredibly rapid change in computing style after 2000, partly because of a basic misunderstanding about how it should be learned. Companies did not really replace legacy COBOL systems with OOP systems; they built incredibly intelligent end-user workstation applications in the new languages, while the legacy applications became more static and tended to demand more specialized expertise to maintain. (Vantage has its own world, with its bizarre I/O modules and link-edit decks, and, as they say, “rules the world” for professionals who work with it.) The job market became difficult (besides economic vicissitudes and the offshoring of more conventional jobs) partly because the skillsets became so specialized and fragmented; the professional had to predict what would be hot and what he should become an expert in, and predict accurately.
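Much of that Y2K retrofitting, incidentally, amounted to “windowing” two-digit years rather than expanding every file layout. Here is a minimal sketch of the idea, written in Java rather than the COBOL it was usually done in; the pivot year of 50 and the class name are my own illustrative assumptions, not anyone’s standard.

    // Illustrative only: interpret a two-digit year using a fixed pivot
    // ("windowing"), the cheap alternative to expanding every record layout.
    // The pivot of 50 is an assumption for this sketch, not a standard.
    public class YearWindow {
        private static final int PIVOT = 50;

        // 00..49 -> 2000..2049, 50..99 -> 1950..1999
        public static int expandYear(int twoDigitYear) {
            if (twoDigitYear < 0 || twoDigitYear > 99) {
                throw new IllegalArgumentException("expected a two-digit year");
            }
            return (twoDigitYear < PIVOT) ? 2000 + twoDigitYear : 1900 + twoDigitYear;
        }

        public static void main(String[] args) {
            System.out.println(expandYear(3));   // prints 2003
            System.out.println(expandYear(87));  // prints 1987
        }
    }

The catch, of course, was that the window only postponed the problem, which is one reason the remediation work dragged on past the midnight event itself.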
