Thursday, December 21, 2006

Mainframe to client-server, more


I’ve noticed a marked increase in the number of calls regarding mainframe contracts in the second half of 2006. Although many of them look for Vantage 1 and DB2, in general the requirements have become less specific than they were even a year ago.

Yet in the late 1990s everyone said that anyone who didn't switch to "client-server" (or to "open systems") would become unemployable once Y2K was over. Remember that Y2K generated tremendous demand for routine mainframe programming (including ALC) in the late 1990s, and that demand did drop off precipitously right after 2000 started. There have been other blips, though: MMIS, HIPAA, the Euro conversion, a variety of social welfare programs, and even some special projects for the IRS. Many code maintenance jobs went overseas, especially to India, and sometimes this included night support (taking advantage of the time-zone difference). There seems to be some evidence that some of this work is coming back home. Offshore outsourcing does not always save all the money that companies expect, and sometimes valuable expertise is lost.

In the 1990s it became fashionable to say that older professionals couldn't learn the new stuff. This has been a subject of some controversy before. The real techies started out by running their own webservers at home in the mid 1990s. One friend of mine did this on a 386 machine -- he would scold me for my "astonishing lack of curiosity" when I did not play around with downloading various software just to experiment, which is what you had to do then to learn this stuff -- and another built a web hosting business; I was his stable customer for four years. Yet this all developed just after news commentators started talking about the Web on CNN, and while AOL and Prodigy still lived off of their proprietary content. (How that has changed! Prodigy was a bit clownish in those days.)

Those were the days, my friend (as in the 1968 song). 2400 baud was an acceptable way to get email at home, and even at work, until maybe 1991 or so, 9600 baud to a remote mainframe was a bit of a luxury. The rapid improvement in connectivity options no doubt helped speed up the corporate mergers of the 90s.

By the late 90s, companies like the insurance company that I worked for had developed a paradigm of legacy systems and cycles on the mainframe, fancy COBOL/MVS batch replications to a Unix midtier, C-code (procedural, not object) screen-ems (off CICS), a little DB2 direct connect, TPX, a data access layer in Java with a context factory (I think this fits into the OSI model somehow), a C++ bridge, and a GUI, in this case in Powerbuilder. I moved over to supporting this for two years, and found that employability-guaranteeing expertise in these areas was difficult to get when just responding to user calls or fixing minor bugs. (The tools were a bit backward: the Unix systems had Hummingbird, which felt like a slow TSO/ISPF, and code could be edited in VI (with its 24 simultaneous buffers) or Emacs, which many techies were familiar with but which seem clunky compared to ISPF.) Java seems easier to pick up than Powerbuilder. But I took a course in C# with Visual Studio .NET at a technical college near Minneapolis before moving back to DC, and I found that C# was much more straightforward than any of these.

The only way to get good at this stuff is to do it, and to spend a couple of years developing a system, going through unit testing, QA with user testing, implementation, and support. Just doing post-implementation support isn't enough; you have to do the whole thing. So making the "switch" (to "open systems") is a several-year commitment. Now Visual Studio .NET looks like a much more straightforward environment than anything my company had -- but what about platform independence? And instead of screen-ems and replications, XML seems like a much more straightforward technology for moving data around. Health care companies have used it heavily, as it seems to fit in with HIPAA compliance more easily.

Of course, what I want to do with my own “knowledge management”, as discussed on my other blogs, is to get it into a database and have an intelligence engine (something like a super data access) connect all the dots. Right now, Visual Studio may be the most straightforward way to do this. You can download Express for free and work with the database and webserver portions separately, but to make something runnable, it looks like you need Visual Studio Professional, close to a thousand bucks. But Microsoft keeps changing this (they need your money, though), so I will see.

Picture: operations research and dynamic programming from 1970, RCA Spectra environment.

Monday, December 18, 2006

More on public exposure for IT professionals


I have visited the issue before of social networking sites and blogs by I.T. professionals. Employers have become more concerned about these in the past year or so. I suppose some employers would actually like to see participation of candidates on technical blogs, but there can be issues of confidentiality.

There can occur a situation where someone is placed with a client by an agenting company, and the client becomes concerned about the "reputation" of the contractor from Internet content in areas outside of direct job relevance.

There have also been startup companies that promise to manage the "online reputations" of people and that also want to manage their online presence for "appearance" or public relations purposes. Agenting companies might fear that a client would, if it found a contractor's political or personal materials on the web, perceive the contractor as less than "professional" about his/her public space use, but this notion is very subjective.

Generally, as an individual contributor in a W-2 situation, I would not allow an outside company to manage my "reputation" or public appearance (as if it were clothing -- "Sartor Resartus") and I do not believe that this is necessary. In a corp-to-corp situation, and especially if the agenting company pays salary and benefits while I would be "on the bench", I do agree that this is a much more important issue. In such a situation, the agency is selling the candidate as a professional on the specific subject matter, and a major outside Internet presence on competing issues (especially political ones) could create confusion.

I have more concrete statements about this issue at my Johnwboushka site

Also, the Persistence Policy, and suggested blogging policy.

I added JCL to my certifications on Dec 8, 2006. Go to this link.

Earlier posting from Sept 15, 2006.

Sunday, December 10, 2006

IBM Mainframe databases

The most commonly desired database is DB2, which first appeared around 1983. I recall a telephone interview in 2002 in which I was asked about "indexable predicates" and to give circumstances where a full outer join would be used.

The next most common is IMS, with the DL/1 command language and the world of PCBs and PSBs. IMS-DC as a TP monitor is sometimes needed, and it is a rare skill, since most people learned only CICS and relatively few installations still have it.

A simpler "relational" system was ADR's Datacom/DB along with Datacom/DC, which Chilton Credit Reporting in Dallas used in the 1980s (before the takeover by TRW, eventually leading to a spinoff as Experian). It worked essentially as an inverted list. It did not make one very marketable entering the job market of the 1990s.

IDMS is a "network" system, originating with Cullinane and then belonging to Computer Associates from the 1990s. It had a fourth generation online language called ADS(O) which supplanted the need for conventional command-level CICS programming. IDMS could work with files in VSAM format as well as its own proprietary format. (By the way, you can do command-level CICS in Assembler as well as COBOL, but macro-level CICS was usually done only in Assembler.)

Sperry Univac (in the 1970s) had a DBMS similar to IDMS, called DMS-1100.

Another common database used to be ADABAS, with the accompanying 4GL NATURAL.

For life insurance and annuities, VANTAGE developed a proprietary system to access either VSAM files or DB2 with such specific structures that it is practically a DBMS in its own right, with very specific call structures and link deck conventions that often require gurus to maintain.

Thursday, December 07, 2006

Review topic: JCL, MVS

hiperspace -- comparable to a dataspace but resides in extended storage and helps jobs run more efficiently

ICF -- Integrated Catalog Facility
VSAM files must be catalogued

ADDRSPC (on the JOB or EXEC statement) -- VIRT is the default; REAL means storage is non-pageable

DCB -- you can use the OPTCD subparameter (OPTCD=Q) to read and write data in ASCII

Examples of directing output to pre-printed forms:

//STATEMENT DD SYSOUT=(B,,STM1),FCB=STM1
or
//OPT1 OUTPUT CLASS=G,FORMS=STM1,FCB=STM1
or for laser printer
//OPT1 OUTPUT FLASH=STM1

The OUTPUT statement is used to specify parameter sets for multiple SYSOUT DD's within a particular step. The DEFAULT=YES on an output statement makes the parameters apply to all SYSOUT DDs that don't specify an alternate statement. The DEST subparameter is often used to direct specific forms to other remote printers. In practice, when doing laser printing, many shops require that the programmer insert various special characters in the print line to further control printing, especially in automated stacks with client breaks indicated by various colored papers (agent commission statements, for example). Printer vendors sometimes require content control that goes beyond what is commonly accomplished by JCL output control parameters and subparameters.

Remember that in cataloguing or referencing a tape dataset, the VOL parameter is very useful. A number after the second comma may reference which reel of a multiple volume set, and the number after the third comma can specify the maximum number of reels to create. A default of 5 is assumed. In one small consulting company in 1989 (in a 4381 environment, the "small" mainframe of the time), we kept summary data on multi-file reels, and had to be very skilled in reading them back correctly with various parameters; we were in an environment where we paid for disk space and computer time and had an economic incentive to reuse summary data. (We also had to manipulate the Medpar detail on various volumes a lot.) Times have changed tremendously since then!

It is often desirable and more professional to allocate temporary datasets within virtual storage. UNIT=VIO will accomplish this. Surprisingly, many shops do not bother to do this. Shops generally encourage the use of temp sets, however, and will monitor programmers' use of unnecessary catalogued datasets.

JES2 and JES3 handle scheduling jobs on specific processors differently. In JES2 you use the /*JOBPARM statement with the system affinity (SYSAFF) parameter. In JES3 you use the //*MAIN statement (two slashes) with the SYSTEM parameter. A global processor is in charge of a whole network, and controls local processors.

Back in the late 1970s at Bradford we used CHKPT a lot on tape jobs, because we had sequential jobs that processed millions of records of MMIS claims data on tape. CHKPT=EOV takes a checkpoint at the end of writing each volume. This was very useful in being able to finish production processing in time, especially with programmers on call on their own salaried time.

Thursday, November 30, 2006

My Brainbench COBOL certification

From the summer of 2002 to 2005 I had Brainbench certifications in COBOL, JCL, and ANSI SQL. I have started renewing these.

I passed the first certification (COBOL II) on Nov 20, 2006.

The information is:

COBOL II, score 3.70 (possible range is 1.0 to 5.0)

percentile: 87

Strengths: Data Division, Process statements, File processing

transcript link: Here (go to Brainbench link).

The next test will be JCL and I expect to get to it next week.

Thursday, November 16, 2006

Review: COBOL: Rounding and mathematical statements

I took both parts of the Brainbench COBOL II refresher Beta Test Thurs Nov 16 and scored in the 61st percentile on one part and the 93rd on the other.

Consider this statement:

MULTIPLY FIELD1 BY FIELD2.
This is an old-fashioned way to code, and the result is stored in FIELD2. It is not rounded unless ROUNDED is specified. ON SIZE ERROR can specify a routine to perform if the picture clause for the receiving field is exceeded.

You can specify a REMAINDER clause on division (just as in grade school arithmetic), or use a REM function.
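A minimal sketch of both statements, with hypothetical field names (all assumed to have appropriate numeric pictures):

*  the result goes into FIELD2; ROUNDED and ON SIZE ERROR protect the receiving picture
       MULTIPLY FIELD1 BY FIELD2 ROUNDED
           ON SIZE ERROR PERFORM 9000-SIZE-ERROR
       END-MULTIPLY.
*  grade-school division with an explicit remainder
       DIVIDE WS-GROSS BY WS-UNITS
           GIVING WS-PER-UNIT REMAINDER WS-LEFTOVER.

Here 9000-SIZE-ERROR is a hypothetical error paragraph; the point is that the options hang off the receiving field.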

The COMPUTE statement usually makes code more readable, with parentheses. But you should know the order of operations (or order of precedence), which is the same as in Algebra I! This is commonly tested on multiple-choice quizzes.

Operations are performed from left to right. Exponentiation is done first. Then go back and do multiplications and divisions. Finally do additions and subtractions.
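For example (hypothetical fields), both of these statements are legal, but the parentheses in the second make the intent explicit:

       COMPUTE WS-RESULT = WS-A + WS-B * WS-C ** 2
       COMPUTE WS-RESULT ROUNDED = (WS-A + WS-B) * (WS-C ** 2)

In the first statement the exponentiation happens first, then the multiplication, then the addition; the second forces the addition to be done before the multiplication.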

Wednesday, November 15, 2006

Review COBOL: External, Global Clauses

The External Clause is associated with object-oriented programming.

The EXTERNAL clause says that a data item or a file description item is external. The data items and group items of an external data record are accessible to every runtime element in the run unit that defines that record. This would mean that the items can be accessed by other items not included in a link deck.
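A minimal sketch (hypothetical names): the same EXTERNAL record described in two separately compiled programs in one run unit refers to a single piece of storage. In program PGMA:

       WORKING-STORAGE SECTION.
*  shared across the run unit, not just this program
       01  WS-SHARED-TOTALS EXTERNAL.
           05  WS-RECORD-COUNT      PIC S9(9) COMP.
           05  WS-TOTAL-AMOUNT      PIC S9(11)V99 COMP-3.

If a program PGMB, not link-edited with PGMA but running in the same run unit, declares the same 01 level with the EXTERNAL clause, both programs read and update the same counters at run time.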

A typical reference is here.

In Microfocus it is possible to specify EXTERNAL on the ASSIGN clause of the SELECT statement so that external files may be accessed. Link for Microfocus is here.

Review COBOL ENTRY and MVS abend list

The COBOL ENTRY statement allows a program to start at an alternate place. It is more common in Microfocus than on the z/OS mainframe, where people tend to code separate modules.

IBM has a definitive discussion at its Boulder CO site, here

It should be studied in conjunction with the CALL statement, reference here:

Note the USAGE IS PROCEDURE-POINTER and USAGE IS FUNCTION-POINTER.
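A minimal sketch of ENTRY used with CALL, with hypothetical program and entry names (SUBPGM, ALTENTRY). In the called program SUBPGM:

       LINKAGE SECTION.
       01  LS-PARM                PIC X(10).
       PROCEDURE DIVISION USING LS-PARM.
       MAIN-PROCESS.
           MOVE 'MAIN' TO LS-PARM
           GOBACK.
       ALT-PROCESS.
*  control starts here when the caller names the entry literal
           ENTRY 'ALTENTRY' USING LS-PARM.
           MOVE 'ALT' TO LS-PARM
           GOBACK.

In the caller:

           CALL 'SUBPGM'   USING WS-PARM
           CALL 'ALTENTRY' USING WS-PARM

The second CALL names the ENTRY literal, not the program name, and execution begins at the statement after the ENTRY.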

This reference has a comprehensive list of the common MVS abends. The most common and notorious (for nightcall support programmers) is the S0C7. But others are interesting. An "empty file" condition usually shows up as an 0001. The S0C5 can happen when falling through an ENTRY statement.

Abend list

Monday, November 13, 2006

Review: Cobol (first quiz)

I took the first Brainbench refresher quiz Friday (Nov. 10) on COBOL II and scored in the 98th percentile. The quiz has 20 questions. For the refresher quiz, there is considerable time allowed.

Just a few topics came up. I won't give away any test material, but I'll continue some review topics.

One idea is that the EVALUATE and WHEN structure can execute only one WHEN clause. It then stops. So the order of the clauses matters if more than one can be true (Murach p 198). This structure is often simpler and more elegant than the equivalent "Nested IF" structure. Professional consultants on contracts would probably be expected to code with this technique when possible.
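A minimal sketch (hypothetical fields and paragraph names) showing why the order of the WHEN clauses matters:

       EVALUATE TRUE
           WHEN WS-CLAIM-AMT > 10000
               PERFORM 300-REFER-TO-ANALYST
           WHEN WS-CLAIM-AMT > 1000
               PERFORM 200-AUTO-ADJUDICATE
           WHEN OTHER
               PERFORM 100-FAST-PATH
       END-EVALUATE

A claim of 15000 satisfies both of the first two conditions, but only 300-REFER-TO-ANALYST is performed, because EVALUATE stops at the first true WHEN.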

If you want a program to abend automatically if it goes out of range on an array, you should use the SSRANGE compiler option.

It is common in replication programming, or in COBOL programs that access older files with bit-coded fields (like CFO masters in life insurance), to access specific fields with reference modification -- a starting position and character count separated by a colon.
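A minimal reference-modification sketch, with hypothetical field names and offsets:

*  bytes 21-22 of the master record hold the status code
       MOVE CFO-MASTER-REC (21:2) TO WS-STATUS-CODE
*  the first three bytes of the policy key identify the company
       MOVE POLICY-KEY (1:3)      TO WS-COMPANY-CODE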

COMP-1 (single precision floating point, 4 bytes) and COMP-2 (double precision floating point, 8 bytes) do not have picture clauses.

The maximum size of an 01 level in COBOL II is 16777215 bytes.

A good link for COBOL interview questions is here.

Previous experience: end user computing flexibility

From 1979-1981 I worked for a "Combined A&B Medicare Consortium" put together by six or seven Blue Cross/Blue Shield Plans that aimed to build a state-of-the-art claims and back-end processing system to sell to all Plans. The project was hosted by Blue Cross and Blue Shield of Texas, in the Stemmons Freeway area of Dallas, near Oak Lawn. At the same time, BCBS Texas signed a contract with EDS for its own Medicare processing, creating quite a tense political situation that eventually helped undermine the project.

The project tried to use the Bryce "Pride-Logik" methodology for specifying all of the system components through various stages of the full system life cycle. I worked on the back end systems, particularly Surveillance and Utilization Review.

Because it was so difficult to get the Plans to agree on their exact reporting requirements, I wanted to develop a method of user-specified options for generating the various reports, particularly for Part B. This was fought as being too vague and too hard to understand.

Yet, perhaps twelve years later, when I was working for USLICO (to become ReliaStar and ING) in Arlington VA, end user specification was part of the philosophy of the salary deduction system (billing and collections), all of which was handled by very well structured COBOL programs (this was even the old COBOL 85). Some of the business was to be put on PC's with Microfocus so that business users would have ultimate control of their business relationships with clients.

So the "philosophy" of computing that I tried to sell in 1981 in a contentious political climate among the non-profit BCBS Plans (and we all know how political BCBS plans are) was becoming everyday design by the early 90s, still in the mainframe world, and would be well embraced by vendors like Dun and Bradstreet MSA, which we used for accounts payable. 4GL's like IE (Information Expert) were being developed then to parameterize everything for easy installation in mainframe environments.

Of course, we all know how Internet thinking took over everything by the late 90s, when everybody was writing GUI's with replication/midtiers or direct connect.

But actually, user-defined computing was pretty well known by the mid to late 1970s. At NBC (National Broadcasting Company), in a Univac 1110 environment, by 1976 we had a General Ledger system from a San Diego company named Infonational, which had heavily parameterized ways of setting up the Chart of Accounts for the various accounting proofs, and which offered a Report Writer (complete with a reconciliation step) that parameterized all of the financial analysis reports.

The BCBS environment for this ambitious Medicare system did try to develop a state-of-the-art IMS implementation, which in 1981 was what you had. (Yup -- IMS and CICS, although an alternative for the TP monitor was IMS/DC). They were even going for the ultimate flexibility of "field level sensitivity" in the PCB's, which IBM explains at this link.

Thursday, November 09, 2006

Tech Republic offers white paper on older v younger IT workers

Subscribers to the Tech Republic newsletters can get a PDF file (free) about the cultural conflicts within information technology between older and younger workers. The paper suggests that spot shortages in legacy skills like COBOL are increasing outside of major cities as older programmers leave the market in the wake of the 2002 crash. The paper also suggests that older workers often are better at "connecting the dots" and seeing unwarranted business or legal risks in certain ways of doing things--"know the damage that you can do."

The paper can be accessed here.

The paper is from Forrester Research and is titled "CIOs: Avoid War Between IT's Twentysomethings and More Mature Workers." It is not legally driven or concerned with age discrimination laws; it is more concerned with business productivity. However, in my opinion, discrimination laws may have driven the extreme specificity of job requirements for many contracts, especially with state governments for MMIS and welfare department IT contracts. That is because if the laundry list of a position is detailed and specific, and if you fill the position with the closest match (regardless of any characteristics of the job applicant), the theory goes that you are legally safer from any kind of employment litigation. However, in practice some skills are more likely to be held by applicants within a given age range, and this would make one wonder about possible disparate impact claims.

To view the content you will need a Tech Republic subscription.

Monday, November 06, 2006

Review topic: COBOL, PERFORM issues; leap year

I have found notes online that maintain that PERFORM UNTIL is equivalent to PERFORM WITH TEST BEFORE. This has always been the "common sense" understanding of the UNTIL clause.

Murach, Structured Cobol, 2000, documents PERFORM WITH TEST AFTER ... UNTIL on p. 197.
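A minimal sketch (hypothetical paragraph and condition names) contrasting the two forms:

*  test before (the default): 100-READ-CLAIM may execute zero times
       PERFORM 100-READ-CLAIM
           UNTIL END-OF-FILE
*  test after: 100-READ-CLAIM executes at least once before the condition is checked
       PERFORM 100-READ-CLAIM
           WITH TEST AFTER UNTIL END-OF-FILE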

CALL statements default to "BY REFERENCE", but "BY VALUE" is used to call subprograms in other languages like C and C++ (usually on Unix systems). It is important when calling by value to make all the lengths of the fields match, or you can get memory exceptions (S0C4 on the mainframe).
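A minimal sketch of the two calling conventions, with hypothetical names; the fullword binary item is chosen to match a C int on the other side:

       01  WS-POLICY-REC          PIC X(200).
       01  WS-PLAN-CODE           PIC S9(9) COMP.
*  default BY REFERENCE: the subprogram can change WS-POLICY-REC
           CALL 'SUBPGM' USING WS-POLICY-REC
*  BY VALUE: a copy is passed; the length must match the C parameter
           CALL 'CCALC'  USING BY VALUE WS-PLAN-CODE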

Here is a reference

I found a reference to the issue on IBM mainframes here.

2000 was a leap year. Reference:

Therefore, code to compute a leap year like this should work thru 2099:
DIVIDE MYYEAR BY 4 GIVING MYNUMBER REMAINDER MYLEAP
(all the MY--- are numeric)

However, sometimes, as in actuarial calculations about present value or life expectancy (as in insurance companies) we do have to worry about the year 2100, so you would need to test for centuries (other than 2000) that are not leap years because of the precise length of time the revolution of the earth around the sun takes. In 2100, we could have a "mini Y2K" problem.
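A hedged sketch of a century-safe version, extending the DIVIDE above (the WS- work fields are hypothetical): a year is a leap year if it is divisible by 4 and either not divisible by 100 or divisible by 400.

       DIVIDE MYYEAR BY 4   GIVING WS-Q4   REMAINDER WS-R4
       DIVIDE MYYEAR BY 100 GIVING WS-Q100 REMAINDER WS-R100
       DIVIDE MYYEAR BY 400 GIVING WS-Q400 REMAINDER WS-R400
*  divisible by 4, and either not a century year or divisible by 400
       IF WS-R4 = ZERO AND (WS-R100 NOT = ZERO OR WS-R400 = ZERO)
           MOVE 'Y' TO WS-LEAP-FLAG
       ELSE
           MOVE 'N' TO WS-LEAP-FLAG
       END-IF

With this test, 2000 comes out as a leap year and 2100 does not.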

Thursday, November 02, 2006

Retrospect of days with DOS assembler

The fourth movement of Brahms's Third Piano Sonata in f minor (Op 5) is a dirge in B-flat minor called "Retrospect." Some of these entries on this blog recreate a few situations in my 30-year mainframe career.

In the mid 1980s, I worked at a credit reporting company in Dallas, and one of the things I did was convert the monthly and daily billing systems from DOS Assembler to OS Assembler. (Then we replaced the Daily Billing with COBOL.) Where OS has DCB's, DOS had DTF's, and the job ran under OS JCL as PGM=DUO with the actual program name as an execution parm -- that is, DUO emulation. If you wanted to update the files, you ran with the UPSI execution parameter turned on. You had to be careful with dataset nodes, because in the mid 1980s there was no security preventing accidental update of production files.

Shortly after the OS conversion of Monthly Billing (I actually babysat the implementation New Years Day 1986 right after my father had died, before getting on a plane to fly home), we ran into an interesting problem with halfword or fullword boundaries in defined storage in assembler. It seemed that certain instructions did not work properly, and as a result quarterly fixed billing for some members did not work, but we did not find out about that until after the fact, mid month. So we had to rerun a lot of stuff. A programmer had made a change, causing the DS alignment to go off.

The other big technical controversy was addressability. I heard horror tales of Y2K conversions of ALC programs with up to seven base registers! We would use the technique of a temporary base register, with R15 as a pivot point.

In those days, the Bible for Assembler Language programming was George Struble, "Assembler Language Programming: The IBM System 360 and 370, 2nd Edition", from Addison-Wesley, 1969 and 1975. How many programmers learned, say, floating point arithmetic in IBM mainframe assembler? Not many. Today the main client that uses Assembler seems to be the IRS, and usually when they look for people they need extreme proficiency in it, including all of the mathematical instructions.

Wednesday, November 01, 2006

The most common COBOL interview question

The most common technical interview question that I would get over the years was "What's the difference between an ordinary SEARCH and a SEARCH ALL? What precautions are necessary in using them?"

Of course, we all know that a SEARCH ALL is a binary search, and that the items must be in sequence for a binary search (successive splits of the card deck) to work. Binary logic -- and the mathematics of logarithms and exponents -- is what makes search engines today work.
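A minimal sketch of a SEARCH ALL (hypothetical table and field names); the table must declare ASCENDING KEY and INDEXED BY, and it must actually be loaded in that key order:

       01  RATE-TABLE.
           05  RATE-ENTRY OCCURS 500 TIMES
                   ASCENDING KEY IS RT-STATE-CODE
                   INDEXED BY RT-IX.
               10  RT-STATE-CODE    PIC X(2).
               10  RT-RATE          PIC S9(3)V99 COMP-3.

*  binary search; for a plain SEARCH you would SET RT-IX TO 1 first
           SEARCH ALL RATE-ENTRY
               AT END PERFORM 400-STATE-NOT-FOUND
               WHEN RT-STATE-CODE (RT-IX) = WS-STATE
                   MOVE RT-RATE (RT-IX) TO WS-RATE
           END-SEARCH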

In 1988-1989, I was in a job where the consulting company (which produced hospital Medicare operating margin reports for clients) was charged for its CPU time, and I reduced the CPU time for a particular outlier job by reading a VSAM file sequentially into a table and using a binary search. The job ran much faster and cost much less. At one point, one day in January 1989, we had to rerun a simulation quickly -- to even stay in business -- and get correct results after changing some parameters based on the Federal Register. Fortunately, I had made this change, and the successful rerun was done in less than an hour.

IDMS: a cautionary tale about the Central Version

I have gotten a call or two from recruiters about some old mainframe experience that I had with IDMS. This was a network database model with schemas, and it had at one time belonged to Cullinane Corporation. It should not be confused with IBM's IMS, which is a hierarchical database model.

In the 1970s Sperry Univac had a similar product, DMS-1100, which had the same kind of structure and was intended for its 1100 series computers (1108, 1110), running under Exec8, whose command-style control language was somewhat like a Linux shell today (simpler than IBM's verbose mainframe JCL). Exec8 also offered an automated JCL generator called SSG in the 1970s, which was more flexible than IBM's concept of procs. While working for Univac, I did a training exercise putting my classical record library on DMS-1100, something easily done today with products like MySQL. In the 1970s, NBC (National Broadcasting Company) was a Univac shop (relatively rare in New York City then) and used DMS-1100 and SSG heavily, as well as "ASCII COBOL".

IDMS could work with either its own format, or with its VSAM transparency, where the data was actually on VSAM files that could be manipulated with standard IBM utilities (IDCAMS). On the salary deduction billing and collection system in the early 1990s, I worked with the VSAM transparency.

Database access could be controlled either with the Central Version (the "CV"), with the inclusion of one specific CV dataset and a specific DD name in a specific STEP in the JCL, or in local mode, in which the actual datasets (in our case, VSAM files) had to be named in the specific STEP (or for the whole job).

When running in local mode, normal security worked. A programmer could not update the production database without specific access. But with the Central Version there was a loophole, because in the 90s at least, Top Secret could not see past the CV control dataset to protect update access to the specific VSAM files. There was a risk that with a wrong CV dataset (pointing to the production files), anyone could update billing or collection data with an ordinary submitted batch job, and the update would not even be noticed for a long time, making recovery very difficult. I don't know if IDMS eventually fixed this.

Sunday, October 22, 2006

A cautionary tale about tape labels

I recall that in the 1980s, at a Texas credit reporting company (then, Chilton), we had gotten used to coding a label parameter to save tape datasets as LABEL=EXPDT=99000 to assume catalog control. That is, the tape could not be written over unless uncatalogued. We had used a third party vendor tape management system.

In 1988 I took a new job in a small shop in Washington DC that had a 4341 and 4381, which at the time were the small mainframe computers that could run MVS (or VM, which actually emulated a PC on the mainframe). I had created and saved a lot of data tapes and coded them the same way. Lo and behold, one summer evening in 1989 (after a small data center move) I find out that all of the tapes are unprotected. Fortunately, none of them had yet been written over. But if they had been, a whole business could have been lost. My career could have ended right then and there.

The new shop used IBM's own standard only, which at the time was EXPDT=yyddd, where 99365 or 99366 kept the tape indefinitely.

It appears that all of this changed with Y2K. The correct way to code is yyyy/ddd (1999/365 or 1999/366 still gives permanent retention).

The lesson is, if you are a consultant and work in a shop where you have some responsibility for manually managing data resources, be very sure that you know the exact rules for your shop. And know the official IBM standards, which may appear on certification quizzes.

Wednesday, October 18, 2006

A sample DB to illustrate some joins (WIP)

The example here supposes that we are building a database of political theory. We are trying to classify the arguments that people try to make to support their political or social positions. This could be built up into a social studies learning tool.

We may have to tinker with the basic setup from time to time to make some of the examples applicable. For example, outer joins wouldn’t make sense if you had already enforced referential integrity with foreign keys.

Future blog entries may refer to this setup.

DCLGEN TABLE(DB01.ARGUMENT)
LIBRARY(DB01.DCLGENS(ARGUMENT))
ACTION(REPLACE)
LANGUAGE(COBOL)
STRUCTURE(ARGUMENT-ROW)
QUOTE

EXEC SQL DECLARE DB01.ARGUMENT TABLE
(TOPIC CHAR(8) NOT NULL,
SUBTOPIC CHAR(8) NOT NULL,
BIBID CHAR(8) NOT NULL,
ARGDATE DATE,
ARGID SMALLINT NOT NULL,
COUNTERARGID SMALLINT,
ARGTEXT VARCHAR(1024),
PRIMARY KEY (ARGID),
FOREIGN KEY ITEMID (BIBID)
REFERENCES DB01.BIBLIO
ON DELETE CASCADE)

(Varchar should normally be at the end of a table)

DCLGEN TABLE(DB01.BIBLIO)
LIBRARY(DB01.DCLGENS(BIBLIO))
ACTION(REPLACE)
LANGUAGE(COBOL)
STRUCTURE(BIBLIO-ROW)
QUOTE

EXEC SQL DECLARE DB01.BIBLIO TABLE
(ITEMID CHAR(8) NOT NULL,
AUTHOR CHAR(32) NOT NULL,
TITLE CHAR(32) NOT NULL,
SERIES CHAR(32) NOT NULL,
URL CHAR (64) NOT NULL,
PUBDATE DATE,
PRIMARY KEY (ITEMID))

Imagine the tables to contain these entries:

ARGUMENT

1amend,Internet,00000001,2006-01-01,11,12, “Children should be protected on the Internet”
1amend,Internet,00000002,2006-01-01,12,11, “filters will protect children”
1amend,Internet,00000003,2006-01-01,13,11, “content labeling will protect children”
service,draft,00000011,2006-02-28,21,22, “military service is an obligation of citizenship”
service,draft,00000012,2006-02-28,22,21, “the draft is involuntary servitude”

BIBLIO

00000001, “Bill Boushka”, “DADT”, “DADT”, “http://www.doaskdotell.com”, 1997-07-11
00000002, “John Doe”, “Autobiography of John Doe”, , ,2001-01-01
00000003, “John Smith”, “Auto JS”, ,2001-01-01
00000011, “Mary Smith”, “Auto MS”, ,2002-01-01
00000012, “Ellen Smith”, “Auto ES”, ,2003-01-01
00000013, “Jerry Smith”, “Auto KS”, ,2002-01-01
00000014, “Dan Smith”, “Auto DS”, ,2003-01-01

An inner join:

SELECT AUTHOR, ARGTEXT
FROM DB01.BIBLIO
INNER JOIN DB01.ARGUMENT
ON ITEMID = BIBID
Produces arguments where there are matching authors, and only names the authors that have arguments.

An outer join is useful, in practical terms, in identifying potential future matches. For this example, pretend that there is no foreign key clause forcing every argument to have an author.

SELECT A.AUTHOR, B.ARGTEXT
FROM DB01.BIBLIO A
LEFT JOIN DB01.ARGUMENT B
ON A.ITEMID = B.BIBID

This join would return all authors as potential matches, but only the arguments authored by matched authors. If you coded a RIGHT JOIN you would get the same result with the tables switched in the SELECT statement. Full outer joins allow only an equality operation, but left and right joins can use any comparison operator.

A good reference is this:

Tuesday, October 10, 2006

Review topic: DB2, date and time

Some quizzes on DB2 programming may test knowledge of date formats. Sometimes there is confusion about character and numeric representations.

A DATE data type in DB2 is represented in character format, in Y2K-safe chronological order: “yyyy-mm-dd”. That is 10 characters in COBOL. The order will always be in logical chronological sequence.

In a similar fashion, a TIME data type is simply “hh.mm.ss”, 8 characters in COBOL.

However, these types have the implicit ability to generate durations (when two different items are “subtracted”) and to be modified by adding and subtracting durations. DB2 regards a date duration as DEC(8,0), which would be hosted in a COBOL variable as PIC S9(8) COMP-3. A time duration is regarded as DEC(6,0) and would have a COBOL picture of PIC S9(6) COMP-3. (The pictures could obviously be S9(9) COMP-3 and S9(7) COMP-3, to fill out the packed-decimal bytes.)
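A hedged embedded-SQL sketch using the work-in-progress ARGUMENT table from the October 18 entry (the host variable names are hypothetical): subtracting two dates yields a date duration in yyyymmdd form.

       01  WS-ARGID               PIC S9(4)  COMP.
       01  WS-AGE-DURATION        PIC S9(8)  COMP-3.

*  date minus date gives a DEC(8,0) date duration
           EXEC SQL
               SELECT CURRENT DATE - ARGDATE
                 INTO :WS-AGE-DURATION
                 FROM DB01.ARGUMENT
                WHERE ARGID = :WS-ARGID
           END-EXEC

A value of 00000115, for example, means a duration of one month and fifteen days.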

By the way, here is a good Ascii-EBCDIC chart.

Review topic: CICS file types; the RIDFLD option

CICS transactions reference data either through a commercial database package (DB2, IMS, Adabas, IDMS, Datacom DB) or through VSAM itself. Some commercial packages, like Vantage for life insurance and annuities (or the older and much less successful VLN), access VSAM through their own proprietary I/O modules, with which application programmers in a shop using these packages must become very proficient.

Nevertheless, recruiters for mainframe jobs always expect proficiency in VSAM.

A good reference book for VSAM programming is a red paperback, Practical VSAM for Today's Programmers, by James G. Janossy and Richard E. Guzik, published by Wiley in 1988, ISBN 0-471-85107-8. The book gives a lot of historical background and “philosophy” behind VSAM in connection with both the old DOS and MVS as it developed throughout the 1970s and 1980s.

Technical interviewers are likely to expect a body of technical knowledge on how to access various VSAM formats from CICS transactions. In many shops, however, programmers often use only a portion of the technical knowledge that recruiters and certification tests may expect the applicant to demonstrate. This is particularly true of maintenance programmers.

For example, consider VSAM formats. By far the most common is the Key Sequenced Data Set, the KSDS. But older formats include the Entry Sequenced Data Set (ESDS), essentially a sequential file, and the Relative Record Data Set (RRDS), related to the old BDAM. With an ESDS, the DELETE command cannot be used, and neither can START, since an ESDS can only be accessed sequentially from the beginning of the file. Both are available for an RRDS.

An important concept will be the RIDFLD, the “record ID field.” Applicants may need to be familiar with the RRN (relative record number) or RBA (relative byte address) options that may occur on some CICS commands, in connection with the RIDFLD.

When RRN is specified, the RIDFLD specifies a relative record number, and the file must be a RRDS.

When RBA is specified, the file is an ESDS, and the RIDFLD is a relative-byte address.

With the KSDS, however (again, by far the most common), the RIDFLD must give the name of the field that contains the key value for the record (it could be an alternate index).

When you want to read a KSDS sequentially, you must use a STARTBR command (like the COBOL START, which is optional in batch COBOL if you want to start at the “beginning” of the file.)
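A minimal command-level sketch of a KSDS browse (the file name CLAIMKSD and the WS- fields are hypothetical); RIDFLD names the key field, and GTEQ positions at the first record with a key greater than or equal to it:

           EXEC CICS STARTBR
               FILE('CLAIMKSD')
               RIDFLD(WS-CLAIM-KEY)
               GTEQ
           END-EXEC
           EXEC CICS READNEXT
               FILE('CLAIMKSD')
               INTO(WS-CLAIM-REC)
               RIDFLD(WS-CLAIM-KEY)
           END-EXEC
           EXEC CICS ENDBR
               FILE('CLAIMKSD')
           END-EXEC

Older code may say DATASET instead of FILE; CICS updates WS-CLAIM-KEY with the key of the record actually retrieved on each READNEXT.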

Monday, October 09, 2006

Review topics: Change Control

I am starting to summarize a few mainframe review questions that could come up in phone interviews or in technical quizzes given by recruiters. This is all part of a review process that may lead to my own recertification in a job search.

Topic: Change Control

The skill most often requested is Endevor. I am familiar with ChangeMan and CA-Librarian.

I became familiar with ChangeMan in the mid 1990s at ReliaStar. The package automatically stages the components of an elevation (source, link decks, parmlibs, procs, JCL, etc.) through the various stages of test, QA, and production. The package will not let you compile and link a load module without staging the source. This automatically guarantees that when a package is moved to production, the source matches the load modules (a security issue). That to me sounds like a good concept for an interview question.

I first had CA-Librarian in 1990. At the time, one had to "process" a module to freeze it, and it was possible to produce load modules that could be elevated without freezing the source. On the other hand, a source module itself could not be elevated without being frozen individually. This does sound like it could expose an installation to security problems, and I don't know whether this has been changed since then. I suspect that it has.

More review topics will follow.

Monday, October 02, 2006

Can techies sell?

Bob Weinstein’s Tech Watch column (one place is The Washington Times, Recruitment Times, Oct. 2, 2006) discusses how “companies need techies who can sell.” Indeed they do. It is easier to budget a position that is at least partially compensated by commissions, particularly as the economy just starts to uptick.

As a partially retired “baby boomer” from the largely mainframe IT market, I have been asked why I am not willing to “sell” what I worked on. Become a life insurance agent. Or a software salesman. That sounds like a very natural thing as you age out of the fast paced geeky coder market.

A short answer is that sales for sales sake is against my temperament. I am a Myers-Briggs blue, the artist. I like to create the content and get it published. I like to find the truth, disseminate it, and get people to deal with what the truth means.

Of course, in sales, you are paid to represent in public just one point of view about something. It is a Faustian deal, where you give up all pretense of objectivity that you learned in school. I see many want ads for salespeople, for people who have proved that they can sell anything. No technical experience in the hiring company’s products or services needed. Sales culture is its own mindset, as so well demonstrated in the little art film “100 Mile Rule.” (My review is here.) It does seem to be predicated on socializing for its own sake, and on manipulating people and their perceptions and on hiding the truth. From my perspective it sounds cheesy, but that’s relativistic. Always be closing!

A couple of points here, though. Artists still need sales and people skills to sell their own work -- just read all the success stories about Hollywood in the popular fashion magazines. And it is perfectly all right to help sell someone else’s work if that person’s work has a connection to your own and you can work in synergy. That’s a good and necessary strategy.

Furthermore, technology sales people have to keep as current as the coders. They have to help non-technical business executives solve technology problems. There is a legitimate place for this, even if it doesn’t fit my own temperament. I think, though, that for a lot of ex-techies, selling someone else’s content and not your own represents a major psychological challenge.

Tuesday, September 26, 2006

Are mainframe programmers welcome back? -- Maybe so, and then some

Can I go back to mainframe programming after 4+ years? (The grand plan)

You bet. COBOL and IBM mainframe JCL are verbose, but once you’ve done them for a couple decades, they stay with you forever. Their wordiness is really a kind of self-documentation, which helps long term skill retention.

But as said in earlier entries in this blog, we have to look at the market, and how that comports with personal temperament, in this case, mine. Gradually younger professionals replaced the centralized mainframe processing culture with a distributed, user- or customer-driven culture that favored client-server, OOP, and modernism -- and languages that actually look cryptic. After the Y2K crunch was over, demand for generalist mainframe applications developers seemed to nosedive.

At the same time, we saw recruiters troll the country for specialists in orphaned technologies. Although conventional wisdom used to preach diversification as the ticket to job mobility, now it was over-specialization -- the willingness to stick with something others perceived as outmoded. The market will live in paradoxes.

“General practitioners” in mainframe technology seemed to become almost unemployable. In-house systems were replaced by specialized packages (like Vantage in life insurance) requiring nitpicky knowledge. Other maintenance operations could be easily offshored.

However, since about the beginning of 2006, there seems to have been an upsurge in demand for mainframe skills. I get many more calls than I used to. I could use better income. I was the super GP, the “family practice” of mainframe computing. Can I go back after 4-1/2 years?

I can be philosophical and claim that the business cycle is a natural, “Darwinian” weeding out, a rank-and-yank process. People who drop out are the failures, and have to learn something else and start over -- or, if they are writers and artists, they need to start actually making people pay for their work. Nevertheless, there are a couple of developments that could make mainframe work viable again.

One is if companies again respect twenty-plus years of basic big data center experience more than they have recently. I had to learn the right way to test, implement and support changes to huge systems that ran huge volumes in every nightly cycle, and to deal with the night calls that you did not get paid for (salaried). I had to learn how to make elevations goof-proof. It can be done, but it takes tremendous maturity, judgment, and a “best practices” mentality, as well as absolutely correct use of change control software to guarantee the integrity of the systems. You have to be conscious of the damage you could do. I don’t think that the young hires in a lot of situations have this kind of judgment.

So is the mainframe market for “experienced data center generalists” recovering?

The other situation would occur if the number of “generalist” mainframe positions is actually increasing, and if so many former programmers now in their 50s and 60s have moved on to other things (in my case, trying to get a film made). It used to be a matter of speculation that employers would eventually regret forcing out so many experienced professionals in their 50s with large severance agreements during downsizings. They would need them, and the experience would be gone, especially with older technologies. There seems to be some anecdotal evidence that this is now really going on. Not all offshoring agreements have produced the savings once expected. Market forces, and not just unions, could well reverse the trend to offshore many jobs.

The other possibility for bridging the 4-1/2 year gap is to take certification tests. Brainbench offers certification tests in a wide variety of languages and disciplines. I had taken COBOL, JCL, and ANSI-SQL in 2002, but the certifications expire in three years.

What about advancement?

Well, that is a touchy subject. Let me say right off, that I am not interested in advancement just for the sake of proving that I can “compete” and own a span of control, a domain of people. That does not fit my personality. Companies, when they hire consultants, say they want people to code, test, and implement – they want individual contributors. In many cases, the programmers will be employed by consulting firms that will want to market the contractors publicly as professionals in specialized expertise areas. Normally, that would seem to be the acceptable notion of advancement: to advocate a specific expertise.

Of course, I have been re-inventing myself very publicly in a few controversial areas. Here is not the place to advocate any particular political opinion, other than to express the underlying idea that technology can help the public understand the different modes of the way people think about all kinds of things. Technology can democratize, and it can help reduce the polarization of many political issues that seem to pander to special interests when they are handled by large organizations and lobbying groups, and become partisan.

That’s actually where I would prefer to show leadership. I can think of at least four areas where this gets more specific.

(1) Develop a system to help financial institutions and lenders perform due diligence when granting consumer credit, by checking applications against National Change of Address and performing mandatory consumer notification. The desired end result is to relieve consumers of the worry of having to shred their snail mail.

(2) Develop a cost-effective system to enable webmasters to label all of their content, and standards for all browsers to interpret the content. The end result would be much improved mechanisms to help parents protect children from inappropriate content while allowing adults free expression.

(3) Develop a database application (call it a “do ask do tell” database) to categorize and correlate political and social arguments made on a variety of contentious issues. Call this “knowledge management”; it would build on the open source, Wikipedia paradigm. But the aim is to give high school and college students in social studies tools to develop much more refined critical thinking and debate skills.

All of these projects would require the coordination of the work of a lot of vendors and software companies, with careful user and beta testing. For any one project, one company would have to be “in charge.” The projects would require detailed project management. Sorry, I never was a formal project manager. People get certified in project management, too.

Another idea (4), is to develop “best practices” for employer and school-based blogging policies. This has suddenly become a big controversy.

There is a reason that I would want to take leadership ownership of something like one of these ideas: I have done a lot of research and writing (on my own sites) about all of them since “retirement” at the end of 2001. It all grew out of my 1997 “Do Ask Do Tell” book, which itself grew out of a most subtle political issue. You can read about that at my sites.

Oh, yes, I do want to get the movie made.

Wednesday, September 20, 2006

Specialize! The paradoxes of the mainframe I.T. job market


The job market mid to late 2006 sounds like a world of crazy contradictions. Recruiters call, and seem (sometimes speaking English in a way hard to understand on a cell phone, at least) anxious to submit my resume, even if I haven't worked in a formal mainframe job for 4-1/2 years. For that one reason alone, I suppose, many of the submissions don't go anywhere. There is a second main reason, however. My experience, however lengthy, is not specialized enough in one specific area.

In the mainframe area it is easy to identify some specializations where employers are eager to find people who will relocate temporarily for long term contracts. Most of these would be W-2, and the contractor would be responsible for housing himself/herself in an extended stay corporate apartment out of the rate. (Generally, I am told, that would not be excluded from taxation if the assignment went over 364 days in a year.)

One specialization is MMIS (Medicaid Management Information System), but practically all states require that a contractor have a minimum of 24 months of MMIS experience, preferably within the past five or so years. These jobs favor people who got into the MMIS market and stayed there.

Another specialty would be DB2 internals, support, and fine tuning, since the performance of high transaction volume DB2 systems is often very sensitive to design decisions (like what kind of joins to use in selects).

Another one that occurs is Vantage for Life Insurance and Annuities. There is a "Vantage Rules the World" problem in the life insurance business. Vantage has its own proprietary I/O access methods (whether to VSAM or to DB2) with specific conventions in call statements and link editing that must be followed exactly, and are often hard to understand, requiring people to develop narrow expertise. So again, when an employer needs a Vantage consultant, it needs him/her badly, but the person must have very specific experience. There was a similar older system, VLN, developed in the 1980s, but the company went under, and Vantage took over the market.

You also find a desire for experience with welfare and social service systems in state governments, and these often require experience with CASE tools, which were all the rage in the early 1990s.

Finally, you will find cravings for experience with orphaned mainframe technologies that have largely been replaced; for companies still running them, the needs are pressing. One example would be IMS, which seems cumbersome to people today, especially IMS-DC.

In the post 9/11 period, we heard a lot about companies sending mainframe maintenance and technical support overseas to India, even production support (night time cycles could be maintained by people working days with the 12 hour time difference). There are open questions as to how effective this has been and whether some of this work is coming back.

But some of these observations explain the nature of the mainframe market today. It favors single people, without family care obligations, who can leave home and live alone for months at a time (with high hourly rates but without fringe benefits). Some will question the social responsibility of this kind of employment practice; but it is not intentional on the employers' part, it is more an anomaly of how things have evolved. Curiously, it is an outgrowth of what we call libertarian culture.

Friday, September 15, 2006

Could social networking profiles affect IT jobs?


The media has drenched us with warnings about employers checking social networking site profiles (Myspace, Facebook) and personal blogs, and even "search engine tracks" of job applicants.

This has become a topic that can be perceived in more than one way. It's understandable that someone in a sales job, who visits client sites or goes into people's homes, who works with small children, or who speaks for the company -- could embarrass the company and disturb clients or customers with over-the-top personal information that clients can find easily these days with search engines.

I.T. people often work in purely technical tasks, and often remain inhouse. The big part of an IT job is often technical perfection, and error-free implementation and support. It takes a focused, detail-oriented personality that sometimes appeals to more introverted people, who often are going to be drawn to artistic and literary pursuits in their own lives. On the other hand, IT people have to work in teams, although the teams are often internal.

The practice over the years has been to fill many development positions with W-2 contractors, who will work at a site for six months to a year. Typically, a recruiter contacts someone based on his online resume (usually at a site like Dice.com) and the person is not hired until chosen by the client. But after the first job, with some contracting companies, the professional goes "on the bench" and sometimes is paid for bench time, which can be used for training (particularly with a job that offers salary and benefits). A company like this will want to be able to offer the resumes of the contractors that it uses repeatedly. This has been common practice for decades. But now, the Internet is there, and the notion that a person's online presence should be professionally managed is emerging.

I have not heard much specifically about this, but one can see where it is heading.

Here is a February 12 column by Mary Ellen Slayter of The Washington Post on this issue. (It may require an online subscription.) She mentions Tim DeMello, who has been developing a service to maintain profiles (Ziggs.com). Would companies eventually force most professionals to surrender "right of publicity" to services like this? I wonder.

I weigh in some more on this in another online essay at another pilot site.

Tuesday, September 12, 2006

Some Internet and textbook resources to practice for technical quizzes

Here are a few Internet sites that I found with multiple choice and short-answer drill questions on typical mainframe IBM technical topics. Some agencies do skills testing, so these references would give the job candidate a good start.

AOL Member Cronid gives a news reference here.
Click on DB2 for a short-answer DB2 quiz. There is a multiple choice quiz on CICS, and quizzes on COBOL, JCL, SAS, VSAM and other topics in the left hand frame column.

CICS (Command Level): Cronid on AOL gives this quiz. He provides the multiple choice answer key at intervals after groups of questions.

Joe Geller has a similar home page for DB2 and relational databases.

Here is his link to DB2 Quiz #1 which puts the questions and answers in different frames.

These quizzes can provide a good refresher.

I recall in 2002 being asked in a phone screening to explain what an indexable predicate is, and to discuss when one would use an outer join.


The textbooks that I prefer largely come from Mike Murach & Associates, the technical book publisher with the "M" trademark.

For example:

Murach's Structured Cobol (with Mike Murach, Anne Prince, Raul Menendez, 2000)
(note the section on Object-oriented COBOL at the end)

Murach's OS/390 and z/OS JCL (with Raul Menendez and Doug Lowe, 2002)

Murach's CICS for the COBOL Programmer (Menendez, Lowe, 2001)

Edward A Kelly: An Invitation to MVS Using COBOL, TAB, 1989.

Murach: Curtis Garvin and Steve Eckols. DB2 for the COBOL Programmer, 1999

Steve Eckols, IMS for the COBOL Programmer, 1985.

Friday, September 08, 2006

Upcoming job fairs – Northern Virginia

I will start adding detailed entries to this blog, as they may be helpful to job seekers. That means that earlier essays on this blog may drop off into archives. I will provide an index to all of the old links here.

Dice.com provides an advisory about two fairs next week:

The company that coordinates and registers people for these fairs is Targetedjobfairs.com

There will be job fairs at
Tuesday, September 12, 2006
11 a.m. to 3 p.m.
Dulles Expo & Conference Center
4368 Chantilly Center
Chantilly, VA 20153

This appears to be near the intersection of Route 50 and Route 28. Mapquest gives
4342-4399 Chantilly Shopping Ctr
Chantilly, VA 20151, US

The specific registration link for this event is at Targetedjobfairs
(specific link)

Wednesday, September 13, 2006
11 a.m. to 3 p.m.
Sheraton National Hotel
900 South Orme Street
Arlington, VA 22204-4520

This location is off Washington Blvd, between Arlington Cemetery and I-395. I believe it is at some distance (about one mile) from the Pentagon or Pentagon City Metro stops. I used to live near Glebe Road and I-395 in a highrise, and there was a bus up Army Navy Drive.

The specific registration appears to be at Targetedjobfairs (specific link)
It would appear that these events are open to professionals without clearances.

Sunday, March 26, 2006

The Schizophrenic Job Market

Let’s face it. The culture of the computer professional world in the last three decades of the Twentieth Century was a perfect fit for the introvert (like me). It emphasized truthful analysis and perfection at the expense of sociability. There was a culture that you could make good money forever as an individual contributor, avoid management and sales cultures, and live your own life without social pressures; the job had no public implications. And it provided open entry. There was an unusual need for people who could fill these positions, starting with the Cold War buildup even in the late 50s, which was translating to general commercial use by the late 60s.

This all remained pretty much intact until after Y2K. We called the network of positions and people “The Business.” There were dips, to be sure, especially in the late 1980s when hostile takeovers and sudden mergers eliminated many IT departments and applications were consolidated. But the mergers of the 90s were different, as data tended to be merged at the user presentation layer rather than in the legacy applications themselves. In the 90s we heard about “the War for Talent.” The explosion of the Internet and the Y2K crunch kept the job market artificially elevated until just after the 2000 New Year. We all know what happened: financial scandals, the bursting of the Internet bubble, and 9/11. Broadband allowed companies to send routine programming and maintenance jobs, even production support, overseas for much lower labor costs.

Today recruiters regularly call me about specific items in my resume, most commonly my Medicaid MMIS experience in the late 1970s. (That is telling, because what I learned from checking numbers from my MARS nursing home reports then predicted the demographic eldercare crisis we have today.) Many of the jobs are “W-2” contracts, less than a year, in which the associate is paid an hourly rate and is expected to pay his or her own temporary housing expenses out of that (it raises some tax questions, too). It usually turns out that the client wants a very specific list of skills, often hard to find; otherwise the client will find plenty of people in its own geographic area.

Recruiters often work for staffing firms that present themselves as providing technical staff to client companies needing specific skill sets. However, a candidate is almost never hired until requested by a specific client for a specific job. There are cases where the candidate becomes a salaried employee of the staffing company with benefits, and can be "on the bench" and paid between assignments, but this is not as common as it was in the 80s and 90s. Recruiting companies could become more efficient by communicating more honestly to candidates what clients really need, rather than hiring based on such short-term needs.

In the year 2000, I attempted to make the “switch” from “mainframe” to “client server” in a support environment where I would have to pick up a lot of skills OJT but apply them in a very superficial way. This turned out to be a mistake. I had thought that there would be value in combining mainframe with client server, but my exposure was just too superficial. I would have done better staying in mainframe and concentrating, say, on DB2 technical expertise.

So one of the lessons is: specialize. College and graduate students have the opportunity to work with their professors and guidance counselors and figure out specifically what skills employers want. (Security is a biggie. So is portability.) They should do so. But we have to go beyond the Geek and get to the Beauty.

Employers today are also likely to look for a track record of upward progression and advancement, and commitment to the field. In the late 1980s, middle management was not the place to be; it was the grunts who “did the work” whose jobs were safer, particularly if they had the mainstream skills. Today, it is different. Employers want to see evidence of leadership and socialization.

One issue that comes into consideration is project management. At one time this was viewed by some people as an area for those who were less “geeky.” Project management became a discipline itself, with its own sophisticated software and certification tests. But today’s biggest challenges and job opportunities may come for people who can get different companies and other entities like advocacy organizations and governments to work together. There are big-time problems resolving legal issues associated with the way people use new technology that will require companies to work together in unprecedented fashion.

Another is a bad word: sales. Vendors are looking for technically versed sales professionals. That has a negative connotation for many techies, who feel that their lives become Faustian bargains of letting an employer take over their public identities. Understood. But times are changing.

Older professionals or persons who have "retired" will need to define themselves carefully, and base their strategies on what they have to offer as unique individuals based on a large mixture of lifetime experience. They generally won't match the narrow qualification lists on job boards that are more appropriate for younger people.

Dr. Phil McGraw said on his NBC show, “Winners do things losers don’t want to do,” and that seems to apply to the job market!

Friday, March 24, 2006

Certification of computer professionals


Computer programming has always been relatively unregulated and has effectively offered “free entry.” In some cases no college degree was required. Many other professions are regulated by state licenses. Early on, major employers like IBM and EDS tried to instill a sense of professionalism, particularly with strict dress codes; these loosened gradually over the years, to the point that casual dress had become common by the 1980s.

Today, software vendors offer a variety of certification exams. For hardware, the A+ certification is popular even with high school students, who sometimes can get jobs as computer technicians while still in high school. Some of the certification exams are strenuous, such as Sun’s Java certification tests, which include both a multiple-choice exam covering all language capabilities and a short development project that is uploaded. There has always been a tendency for programmers to master only what they need on a specific job, whereas certification requires systematic mastery of all the material.

Some companies, such as Brainbench, offer a wide list of certification exams in a large number of topics. Some headhunter and recruiting firms use Brainbench exams to test applicants for consulting positions. Software certification can help restore a sense of an applicant’s professionalism among employers.

The Institute for the Certification of Computer Professionals has offered relatively generic multiple-choice examinations in a number of areas and languages for many years. Certification can be maintained with continuing-education credits.

The personal computer and Internet revolutions

In December 1981 I bought my first personal computer, a Radio Shack TRS-80 with a 64-column black-and-white screen, for $3700, including an Okidata dot-matrix printer. In those days, just having word processing was a big deal, and you spent extra for a letter-quality printer. The Apple and the IBM PC (and IBM compatibles) were the top of the line, but other brands like Atari, Commodore and the Osborne would slip away. The DOS of IBM compatibles grew into the modern Microsoft operating systems in a series of steps over twenty years. Database applications became popular in the 80s, starting with dBase III+ (SQL would be added in dBase IV) and Microrim’s R:BASE, which also offered SQL.

Business was definitely getting interested in smaller computers with more user-friendly operating systems. Unix had been around since the late 60s and gradually became popular in academic and defense environments, as did VAX/VMS. There were other experiments, like the MAI Basic Four (with preloaded insurance systems) and various minis with reduced instruction sets and specialized application environments, such as mortgage lending.

I think CompuServe was offering proprietary online content around 1983, but consumer online access really did not take off until the Internet was opened up in 1992, at the end of the first Bush administration. AOL and Prodigy were the major players at first, and I started using AOL in 1994 (by then I owned an IBM PS/2). AOL became dominant, along with MSN, Earthlink, and a few other companies, which by the middle 90s were using mainly HTML to offer content. Other protocols (like Gopher) were dropping off.

By now, the two main PC paradigms were the PC with Microsoft operating systems (going from Windows 3.1 to Windows 95, Windows 98, and finally XP) and the Apple Macintosh, which developed souped-up operating systems built on a Unix (BSD-derived) foundation.

In 1996, AOL offered its customers Personal Publisher, which allowed users to post their own authored content for anyone else with an Internet connection to see. But by around 1994 or so, companies were already offering individuals their own domain registration and hosting. By the late 90s, individually owned domains were popular, and companies realized that they had to include the Web as a major strategy in reaching their customers.

All along, the job market had been growing around open-computing models and languages like C, Pascal and Perl. These were still procedural, but they tended to have a less verbose syntax and style than the third-generation languages used in financial applications. Soon OOP (object-oriented programming) would become a new industry standard. C++ was the major OOP language, soon to be partially displaced by Java because of Java’s portability. Java was introduced around 1995, and by 1999 it was being used widely to develop midtier data access layers as companies merged operations from purchased companies at the presentation rather than the application layer. The OSI model became important: for example see http://www.webopedia.com/quick_ref/OSI_Layers.asp . Object-oriented programming tries to model a system almost as if it were a simulation, with classes, methods, and data objects. Older programmers sometimes have difficulty becoming fluent in it quickly.
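
To make the idea concrete, here is a minimal, hypothetical sketch in Java of the "classes, methods, and data objects" style. The names Policy and TermLifePolicy and their fields are invented for illustration only; they do not come from any particular system described here.

// A minimal, hypothetical sketch of object-oriented modeling in Java.
// The names Policy and TermLifePolicy are illustrative only.
public class Policy {
    // data: each object instance carries its own state
    private final String policyNumber;
    private double annualPremium;

    public Policy(String policyNumber, double annualPremium) {
        this.policyNumber = policyNumber;
        this.annualPremium = annualPremium;
    }

    // a method: behavior bound to the data it operates on
    public double monthlyPremium() {
        return annualPremium / 12.0;
    }

    public String getPolicyNumber() {
        return policyNumber;
    }
}

// inheritance: a specialized class reuses the general one, modeling
// "a term life policy is a kind of policy" the way a simulation would
class TermLifePolicy extends Policy {
    private final int termYears;

    public TermLifePolicy(String policyNumber, double annualPremium, int termYears) {
        super(policyNumber, annualPremium);
        this.termYears = termYears;
    }

    public int getTermYears() {
        return termYears;
    }
}

Very roughly, a programmer coming from COBOL can think of a class as a record layout with its own paragraphs attached to it, which is why the mental shift takes some practice.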

The Internet became the “wild west” and the bubble would burst starting in 2001. It did attract all kinds of get-rich-quick schemes and bad actors, which left a whole slew of new issues involving security (the development of firewalls and anti-virus software), dealing with spam, and widespread copyright infringement associated with immature misuse of peer-to-peer computing, which also was developed, partly by teenagers and college students themselves, in the late 1990s. The rapid development of all of these technologies showed technical agility and brilliance by young programmers, who sometimes did not understand or respect the ethical issues that they were creating.

Actually, many of the better jobs tended to involve the standardization of data and passing it among businesses in platform-independent format with XML, and protocols like SOAP. Data from a mainframe computer could be sent to a graphical user interface after database extraction with direct connect and manipulation with XML.
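
As a rough illustration of what "platform-independent" means in practice, here is a minimal sketch using Java's standard DOM and transformer APIs to turn one extracted record into XML text. The element names (policy, policyNumber, annualPremium) and the values are invented for illustration and are not taken from any actual HIPAA or vendor schema.

import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class PolicyToXml {
    public static void main(String[] args) throws Exception {
        // Build an XML document representing one record extracted from a database.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder().newDocument();

        Element policy = doc.createElement("policy");        // hypothetical element name
        doc.appendChild(policy);

        Element number = doc.createElement("policyNumber");
        number.setTextContent("L-0000123");                   // hypothetical value
        policy.appendChild(number);

        Element premium = doc.createElement("annualPremium");
        premium.setTextContent("480.00");                     // hypothetical value
        policy.appendChild(premium);

        // Serialize to plain text. Any downstream consumer (a GUI, a web
        // service behind SOAP, a trading partner) can parse this without
        // knowing anything about the original mainframe record layout.
        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
                .transform(new DOMSource(doc), new StreamResult(out));
        System.out.println(out);
    }
}

The point of the sketch is that the XML carries its own structure: the producer and the consumer only have to agree on tag names, not on physical record formats.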

But the most profound changes may be the power that the Internet gives to individuals. The same economic forces that led to old mainframe business operations becoming offshored (through broadband) also enabled individuals not only to set up their own transaction-oriented merchant enterprises but also to self-publish their ideas and outlooks, a socially much more radical innovation. Publication went from low-cost desktop publishing to print-on-demand to Web publishing, and migrated towards video and motion pictures, to the point that business models in the entertainment industry may eventually be affected more by legitimate Internet use from new artists than by piracy.

“Free entry” or at least low barrier to entry can have a major impact on many business models and social values and raises ethical issues of personal accountability that just haven’t been seen before. With the “myspace problem” we are seeing a new iceberg.

Mainframe job skills: historical survey

From the 1960s until well into the 1990s there was a well-established large scale computing culture in large companies, particularly financial institutions, life and health insurers and government agencies, centered around large mainframes and, at first, batch processing followed quickly by “demand” and “real time” – effectively online. At one time there were five major companies, dominated by IBM, and I tried to work for a competitor, Sperry Univac.

IBM beat out all of the competition by sheer size and a certain kind of disciplined professionalism. The job control language (JCL) developed a reputation with some people as verbose and unwieldy, but knowledge of JCL was an essential skill by the 1970s. In fact, at one time there was DOS, which was quickly overtaken by MVS. Many shops in the 1970s developed their own commercial applications in house, largely with COBOL and sometimes assembler; the applications would start out with daily and end-of-month batch cycles and follow up with online access through TSO and then CICS. Mainframe databases, hierarchical like IMS or relational like DB2, and sometimes from other vendors (like IDMS, Adabas, or Datacom DB), became popular. CICS was indeed the main teleprocessing monitor, although IMS had its DC and there was Datacom DC. By the early 1980s, most companies provided individual CRT terminals for employees to do their work “online.” Gradually, in-house-written applications were replaced by packages from major mainframe software vendors.

Procedural programming rapidly became more efficient in the 1970s, as structured programming techniques and top-down testing methods were published, and as so-called "fourth generation languages" like DYL280 and Easytrieve came into popularity. One such package, SAS, with its unique philosophy of data and procedure steps, became a comprehensive application development system of its own, and it is particularly popular for analyzing data for statistical patterns, as would be needed for public policy research in areas like health care.

Gradually, the job culture came to reward those programmers who developed skills in the mainstream IBM products, as well as major packages from a variety of closely related vendors like CA (for job scheduling and source management).

Security started to become a major issue in the 1980s. By around 1990, most major companies did not allow programmers to update production files without specific access, although there were many loopholes. The integrity of source management was understood as a significant security issue by the early 1990s.

Even as personal computers and the Internet became important, the job market in the mainframe area remained very strong in the 1990s because of the need for Y2K conversions. The year 2000 came with very few actual problems.

After the Y2K experience, it seems that many companies transferred much of their mainframe maintenance overseas. This even included nightcall support, which could be accomplished from India during their day shift. It seemed for a while that the mainframe job market was almost vanishing. Yet demand remained strong and paid top dollar in specialized areas where detailed technical expertise was required. These areas included particularly DB2, Vantage (for insurance and annuities), some of the CASE tools, and sometimes, surprisingly, IMS, as well as data warehousing. Analysts with enough experience in specific business areas like Medicaid MMIS could get jobs.

There seemed to be a curious phenomenon here. Conventional wisdom in the 1980s and 1990s had suggested that the computer professional should spread out into many areas, but after 2000 the market was rewarding extreme specialization. Companies would scour the country looking for specific skill sets in areas that are perceived as becoming obsolete, like IMS. There was a reward for staying with old skill sets, which was not what many people had expected.

There has been a lot of speculation as to whether the mainframe job market that we knew in the past will come back, whether companies will pull back operations that they offshored. Recruiters tell me that they do not, as of early 2006, see much evidence that this has happened. If it did happen, there would be an increase in demand for older professionals in their 50s and 60s.

Comparison of job skills values to real estate and stock values

The value of one’s skills in the information technology job market is volatile. On an open free market, the ability to earn a living through work sometimes seems a bit like the ability to increase wealth in equity markets or even in a house.

There’s a personal parable here. In the mid 1980s I bought a condo in a moderate-income section of Dallas, with a new-home price of about $40,000. In the late 1980s I decided to leave for the DC area, partly because of a threat to my job from hostile takeovers. I rented the condo out but eventually sold it for $30,000, almost a $10,000 loss. During the severe Texas real estate recession of the time, the value actually dipped into the teens. I don’t know what it is worth today, probably much more than I paid for it.

The saying is you buy low and sell high. I’m afraid that here the opposite happened. I have never bought a house again. Today, of course, housing prices are at record levels in most of the country (I’m not sure about Dallas specifically). The point is, swings in any market can cause people to have to drop out. Then the market comes back with newer players. It’s brutal, it sounds cruel, but it’s just plain capitalism.

One needs to understand the same observation with respect to jobs.

My resume

John W. Boushka
4201 Wilson Blvd #110-688 Arlington VA 22203-1859

571-334-6107

JBoushka@aol.com

OBJECTIVE

Long term: Deployment of materials that help any customer understand and correlate all of the major positions that people take on many social, political and financial issues. It is important to understand why people think the way they do as well as to know their positions. I will be able to provide leadership in developing leading-edge solutions to new legal and ethical problems associated with the deployment of technology, as new social or legal conflicts may often be resolved by further refining and improving information technology. Deployment strategies include software products and services such as content management, knowledge databases, films, and books. In many cases, it is important for me to draw on some unusual personal experiences as well as more conventional workplace technical experience.

Short term: A variety of interim positions in areas like technical writing, information technology applications programming, help desk or inbound call centers. Teaching may provide the opportunity to help high school and college students to develop critical thinking skills and to “connect the dots.”

I did take a retirement package from my last “major” corporate employer after a career in information technology applications in life insurance, health care, credit reporting, and financial reporting.


SUMMARY OF QUALIFICATIONS


A data processing business applications specialist with substantial experience in life insurance, health care, credit reporting, media, accounting, and public policy research systems. Have implemented major applications through the full systems development life cycle on IBM mainframes (batch and on-line). Have provided technical leadership, specification of business and technical requirements, coding, unit and systems integration testing, implementation, documentation, and on-call production support.

Have authored and published three books and numerous short articles on social policy issues relating to individual rights.

TECHNICAL EXPERIENCE


· IBM OS-390 mainframe with MVS (and VM): COBOL II, COBOLMVS, Assembler, JCL, FileAid, SyncSort

· DB2, IMS, IDMS, VSAM, CICS, DataComm DB and DC

· NCOA, Group-1 MOVEUpdate, USPS FASTForward, Nadis

· SAS, DYL280, Easytrieve, Dun & Bradstreet MSA (for accounts payable) and Information Expert, ADSO (a code generator for IDMS), systems life-cycle management

· Unix, SQL with stored procedures, PowerBuilder, Java, TestDirector

· Matching of technical job skills to job descriptions (recruiting)


GENERAL EXPERIENCE



· Telephone support and problem solving for customers

· Telephone negotiations

· Book manuscript preparation and cost-effective printing

· Public policy research (individual and civil rights, discrimination, health care, security)

· Various small volunteer assignments (such as member database maintenance)

ACCOMPLISHMENTS

Insurance and Annuities:

· Reduced head count (by 2) for administration of billing of life insurance premiums to employers through salary deduction, by developing major reporting system enhancements.

· Reduced volume of return mail (by about 20%), by implementing new NCOA (National Change of Address) interface and by clientization of major Vantage system.

· Facilitated cross-selling, by installing new life insurance products on legacy systems.

· Enabled efficient customer service and cross-selling after merger, by implementing legacy replications to a common GUI customer service workbench

· Enhanced security of an accounts payable system with a new signature approval process.

· Reduced production outages and down time by 80% by careful testing and, when in a support role, carefully documenting recurring environmental problems; always provided dependable off-hours support as necessary.

· Consolidated commission statements provided to agents.

· Facilitated fraud reduction by developing and implementing Medicaid Management and Administrative Reporting System (MARS).

Credit Reporting:

· Enabled a credit reporting company to bill more efficiently by implementing consolidated billing to members across multiple bureaus and by implementing modern daily billing system.


Public Policy Research:


· Enabled a small consulting operation to remain in business through quick application problem solving and by setting up the application to run very efficiently

· Attracted client business to public policy consulting business by providing accurate reports on Medicare operating margins.

EMPLOYMENT HISTORY

Interim positions with Minnesota Orchestra Guaranty Fund (part-time, April 2002 to June 2003) as a fund raiser, and with RMA (Risk Management Alternatives) in Mendota Heights, MN (May 2003 to July 2003) as a debt collector for a telecommunications client; left voluntarily to return to Washington, DC area. Also wrote multiple choice certification test items for Brainbench on Business Ethics, early 2003. Substitute teacher in Virginia, from Spring 2004 to Dec. 2005 (See substitute teaching discussion.)

SELF 1997-Present

Self- or cooperatively published three books and supported them with a large website on Verio, as well as another experimental “java starter” website.

ING-Reliastar – Arlington, Va. and Minneapolis, MN 1990-2001

Life insurance and annuities

Senior Programmer Analyst


Implemented consolidated commission statements and salary deduction billing statements; implemented legacy replications to mid-tier; supported service center end users by trouble-shooting a wide variety of application and environmental production problems. Skills emphasized mainframe IBM (COBOLMVS, JCL, DB2, IMS, CICS) with client-server in last two years (Unix, Powerbuilder, java, Sybase stored procedures, screen emulations in C).

Lewin-ICF/Consolidated Consulting Group – Washington, DC 1988-1990

Public policy and social science research

Programmer Analyst

Duties consisted largely of fine-tuning policy simulation model for lobbying clients, with both COBOL and SAS, both mainframe and PC.

Chilton Corporation – Dallas, TX 1981-1988

(now part of Experian)

Credit reporting and credit bureau operations

Lead Programmer Analyst

Duties consisted of designing and implementing major revisions to daily and monthly billing, including consolidated billing, in a mainframe environment (COBOL, Assembler, Datacomm).


COMBINED A&B MEDICARE CONSORTIUM – Dallas, TX 1979-1981

Medicare postpayment utilization review

Systems Consultant

BRADFORD NATIONAL CORPORATION – New York, NY 1977-1979

Medicaid MMIS – MARS

Lead Programmer Analyst


NATIONAL BROADCASTING COMPANY – New York, NY 1974-1977

Media and broadcasting



Programmer Analyst; also television boom operator during strike duty


EDUCATION


I had Brainbench.com certifications in COBOL II, MVS JCL (June 2002), and ANSI SQL (July 2002 to June 2005)

The COBOL and JCL certifications expired in June 2005. The SQL certification expired in July 2005. I am evaluating whether I should re-test to renew them, according to market conditions and other goals.

Object Oriented Programming (OOP) training on the job - 1999


FLMI (Fellow, Life Management Institute, LOMA) – 1995

ICCP ACP and CCP certifications (1992-1998)

M.A. Mathematics – University of Kansas – 1968

Here is the link with detailed information about my Master’s Thesis, “Minimax Rational Function Approximation”.

B.S. Mathematics – George Washington University – 1966

Completed two courses (in Microsoft .NET with C#, and XML) in fall 2002 semester at Hennepin County Technical College (Minnesota)

Additional comments: Simplified Intellectual Property Agreement (for employers)

The following is a scannable text resume that stresses information technology only. Text resume

Visit my “technical” blog at http://billboushka.blogspot.com