Thursday, November 30, 2006

My Brainbench COBOL certification

From the summer of 2002 to 2005 I had Brainbench certifications in COBOL, JCL, and ANSI SQL. I have started renewing these.

I passed the first certification (COBOL II) on Nov 20, 2006.

The information is:

COBOL II, score 3.70 (possible range is 1.0 to 5.0)

percentile: 87

Strengths: Data Division, Process statements, File processing

transcript link: Here (go to Brainbench link).

The next test will be JCL and I expect to get to it next week.

Thursday, November 16, 2006

Review: COBOL: Rounding and mathematical statements

I took both parts of the Brainbench COBOL II refresher Beta Test Thursday, Nov 16, and scored in the 61st percentile and then the 93rd.

Consider this statement:

MULTIPLY FIELD1 BY FIELD2.
This is an old-fashioned way to code, and the result is stored in FIELD2. The result is not rounded unless ROUNDED is specified. ON SIZE ERROR can specify a routine to perform if the PICTURE clause of the receiving field is exceeded.

You can specify a REMAINDER clause on division (just as in grade school arithmetic), or use a REM function.
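A minimal sketch (the MY--- field names are invented) showing ROUNDED, ON SIZE ERROR, and the two remainder techniques together:

MULTIPLY MYPRICE BY MYQTY GIVING MYTOTAL ROUNDED
    ON SIZE ERROR PERFORM 9999-SIZE-ERROR-RTN
END-MULTIPLY.
DIVIDE MYTOTAL BY 12 GIVING MYMONTHLY
    REMAINDER MYLEFTOVER.
COMPUTE MYLEFTOVER = FUNCTION REM (MYTOTAL, 12).

The last line is the intrinsic-function alternative to the REMAINDER clause.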

The COMPUTE statement usually makes code more readable, with parentheses. But you should know the order of operations (or order of precedence), which is the same as in Algebra I! This is commonly tested on multiple-choice quizzes.

Within the same level of precedence, operations are performed from left to right. Exponentiation is done first; then go back and do multiplications and divisions; finally do additions and subtractions.
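For example (field names invented), these two COMPUTE statements are equivalent; the second just makes the precedence explicit with parentheses:

COMPUTE MYRESULT = MYA + MYB * MYC ** 2.
COMPUTE MYRESULT = MYA + (MYB * (MYC ** 2)).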

Wednesday, November 15, 2006

Review COBOL: External, Global Clauses

The EXTERNAL clause is often discussed alongside COBOL's more modern, object-oriented-era features.

The EXTERNAL clause says that a data item or a file description item is external. The data items and group items of an external data record are accessible to every runtime element in the run unit that defines that record. This means the items can be accessed by programs that are not even link-edited together into the same load module (link deck).

A typical reference is here.

In Microfocus it is possible to specify EXTERNAL on the ASSIGN clause of the SELECT statement so that external files may be accessed. The link for Microfocus is here.
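A hedged sketch of both usages, with invented names:

* shared by every program in the run unit that declares it
01  SHARED-TOTALS EXTERNAL.
    05  ST-RECORD-COUNT  PIC 9(9)       COMP.
    05  ST-AMOUNT-TOTAL  PIC S9(11)V99  COMP-3.

* Microfocus external file assignment
SELECT MYFILE ASSIGN TO EXTERNAL MYDD
    ORGANIZATION IS SEQUENTIAL.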

Review: COBOL ENTRY and MVS abend list

The COBOL ENTRY statement allows a program to be entered at an alternate place. It is more common in Microfocus than on z/OS; on the mainframe people tend to code separate modules instead.

IBM has a definitive discussion at its Boulder, CO site, here.

It should be studied in conjunction with the CALL statement, reference here:

Note the USAGE IS PROCEDURE-POINTER and USAGE IS FUNCTION-POINTER.
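A minimal sketch (program and entry names invented) of a subprogram with an alternate entry point, and the calls that reach each one:

* in subprogram MYSUB
PROCEDURE DIVISION USING PARM-A.
    ...
    GOBACK.
ENTRY 'MYSUB2' USING PARM-B.
    ...
    GOBACK.

* in the caller
CALL 'MYSUB' USING WS-PARM-A.
CALL 'MYSUB2' USING WS-PARM-B.

Note the GOBACK before the ENTRY: control should never simply fall into an entry point from the code above it (see the S0C5 below).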

This reference has a comprehensive list of the common MVS abends. The most common and notorious (for nightcall support programmers) is the S0C7. But others are interesting. An "empty file" condition usually shows up as an 0001. The S0C5 can happen when falling through an ENTRY statement.

Abend list

Monday, November 13, 2006

Review: COBOL (first quiz)

I took the first Brainbench refresher quiz Friday (Nov. 10) on COBOL II and scored in the 98th percentile. The quiz has 20 questions, and for the refresher quiz there is considerable time.

Just a few topics came up. I won't give away any test material, but I'll continue some review topics.

One idea is that the EVALUATE and WHEN structure can execute only one WHEN clause. It then stops. So the order of the clauses matters if more than one can be true (Murach p 198). This structure is often simpler and more elegant than the equivalent "Nested IF" structure. Professional consultants on contracts would probably be expected to code with this technique when possible.
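A typical sketch (names invented); only the first WHEN that tests true is executed:

EVALUATE TRUE
    WHEN MYTRAN-CODE = 'A'
        PERFORM 2100-PROCESS-ADD
    WHEN MYTRAN-CODE = 'C'
        PERFORM 2200-PROCESS-CHANGE
    WHEN OTHER
        PERFORM 2900-PROCESS-ERROR
END-EVALUATE.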

If you want a program to abend automatically if it goes out of range on an array, you should use the SSRANGE compiler option.
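For instance (names invented), with SSRANGE the second MOVE below gives a clean range-check diagnostic instead of silently clobbering whatever storage follows the table:

01  MYTABLE.
    05  MYMONTH-AMT OCCURS 12 TIMES PIC S9(7)V99 COMP-3.

MOVE 13 TO MYSUB.
MOVE ZERO TO MYMONTH-AMT (MYSUB).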

It is common in replication programming, or in COBOL programs that access older files with bit-coded fields (like CFO masters in life insurance), to access specific fields with reference modification: a starting position and character count separated by a colon (:).
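Reference modification counts from position 1 and takes a (start:length) pair. A small sketch with invented names:

* move the 5 bytes starting at position 3 of the master record
MOVE MYMASTER-REC (3:5) TO MYPLAN-CODE.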

COMP-1 (single-precision floating point, 4 bytes) and COMP-2 (double-precision floating point, 8 bytes) do not have PICTURE clauses.
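So, in a sketch (names invented), they are declared with no PIC at all:

01  MYSINGLE-FLOAT  COMP-1.
01  MYDOUBLE-FLOAT  COMP-2.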

The maximum size of an 01 level in COBOL II is 16,777,215 bytes.

A good link for COBOL interview questions is here.

Previous experience: end user computing flexibility

From 1979 to 1981 I worked for a "Combined A&B Medicare Consortium" put together by six or seven Blue Cross/Blue Shield Plans that aimed to build a state-of-the-art claims and back-end processing system to sell to all Plans. The project was hosted by Blue Cross and Blue Shield of Texas, in the Stemmons Freeway area of Dallas, near Oak Lawn. At the same time, BCBS Texas signed a contract with EDS for Medicare processing of its own, creating quite a tense political situation that eventually helped undermine the project.

The project tried to use the Bryce "Pride-Logik" methodology for specifying all of the system components through various stages of the full system life cycle. I worked on the back end systems, particularly Surveillance and Utilization Review.

Because it was so difficult to get the Plans to agree on their exact reporting requirements, I wanted to develop a method of user-specified options for generating the various reports, particularly for Part B. This was fought as being too vague and too hard to understand.

Yet, perhaps twelve years later, when I was working for USLICO (to become ReliaStar and ING) in Arlington VA, end user specification was part of the philosophy of the salary deduction system (billing and collections), all of which was handled by very well structured COBOL programs (this was even the old COBOL 85). Some of the business was to be put on PC's with Microfocus so that business users would have ultimate control of their business relationships with clients.

So the "philosophy" of computing that I tried to sell in 1981 in a contentious political climate among the non-profit BCBS Plans (and we all know how political BCBS plans are) was becoming everyday design by the early 90s, still in the mainframe world, and would be well embraced by vendors like Dun and Bradstreet MSA, which we used for accounts payables. 4GL's like IE (Information Expert) were being developed then to paramterize everything for easy installation in mainframe environments.

Of course, we all know how Internet thinking took over everything by the late 90s, when everybody was writing GUI's with replication/midtiers or direct connect.

But actually, user-defined computing was pretty well known by the mid to late 1970s. At NBC (National Broadcasting Company), on a Univac 1110 environment, by 1976 we had a General Ledger system from a San Diego company named Infonational, which had heavily parameterized ways of setting up the Chart of Accounts for the various accounting proofs, and which offered a Report Writer (complete with a reconciliation step) that parameterized all of the financial analysis reports.

The BCBS environment for the penultimate Medicare system did try to develop a state-of-the-art IMS implementation, which in 1981 was what you had. (Yup -- IMS and CICS, although an alternative for the TP monitor was IMS/DC). They were even going for the ultimate flexibility of "field level sensitivity" in the PCB's, which IBM explains at this link.

Thursday, November 09, 2006

Tech Republic offers white paper on older v younger IT workers

Subscribers to the Tech Republic newsletters can get a PDF file (free) about the cultural conflicts within information technology between older and younger workers. The paper suggests that spot shortages in legacy skills like COBOL are increasing outside of major cities as older programmers leave the market in the wake of the 2002 crash. The paper also suggests that older workers often are better at "connecting the dots" and seeing unwarranted business or legal risks in certain ways of doing things--"know the damage that you can do."

The paper can be accessed here.

The paper is from Forrester Research and is titled "CIOs: Avoid War Between IT's Twentysomethings and More Mature Workers." It is not legally driven or concerned with age discrimination laws; it is more concerned with business productivity. However, in my opinion, discrimination laws may have driven the extreme specificity of job requirements for many contracts, especially with state governments for MMIS and welfare department IT contracts. The theory goes that if the laundry list for a position is detailed and specific, and you fill the position with the closest match (regardless of any characteristics of the job applicant), you are legally safer from any kind of employment litigation. However, in practice some skills are more likely to be held by applicants within a given age range, and this would make one wonder about possible disparate impact claims.

To view the content you will need a Tech Republic subscription.

Monday, November 06, 2006

Review topic: COBOL, PERFORM issues; leap year

I have found notes online that maintain that PERFORM UNTIL is equivalent to PERFORM WITH TEST BEFORE. This has always been the "common sense" understanding of the UNTIL clause.

Murach, Structured COBOL, 2000, documents PERFORM WITH TEST AFTER ... UNTIL on p. 197.
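A sketch of the two forms (names invented). TEST BEFORE is the default, so the first PERFORM may execute the paragraph zero times; the second always executes it at least once:

PERFORM 2000-PROCESS-RECORD
    WITH TEST BEFORE UNTIL MYEOF-FLAG = 'Y'.
PERFORM 2000-PROCESS-RECORD
    WITH TEST AFTER UNTIL MYEOF-FLAG = 'Y'.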

CALL statements default to "BY REFERENCE", but "BY VALUE" is used to call subprograms in other languages like C and C++ (usually on Unix systems). It is important when calling by value to make all the lengths of the fields match, or you can get memory exceptions (S0C4 on the mainframe).

Here is a reference

I found a reference to the issue on IBM mainframes here.
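A minimal sketch of the two conventions (program and field names invented):

* BY REFERENCE is the default; the subprogram can update WS-RECORD
CALL 'MYCOBSUB' USING BY REFERENCE WS-RECORD.
* BY VALUE passes a copy; the length must match the C prototype exactly
CALL 'MYCSUB' USING BY VALUE WS-BINARY-FIELD.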

2000 was a leap year. Reference:

Therefore, code like this, followed by a test of whether MYLEAP is zero, should work thru 2099:
DIVIDE MYYEAR BY 4 GIVING MYNUMBER REMAINDER MYLEAP
(all the MY--- fields are numeric)

However, sometimes, as in actuarial calculations about present value or life expectancy (as in insurance companies), we do have to worry about the year 2100, so you would need to test for century years (those not divisible by 400, unlike 2000) that are not leap years, because of the precise length of time the earth's revolution around the sun takes. In 2100, we could have a "mini Y2K" problem.
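A hedged sketch of the full Gregorian rule (divisible by 4, except century years, except those divisible by 400), with invented field names:

MOVE 'N' TO MYLEAP-FLAG.
DIVIDE MYYEAR BY 4   GIVING MYNUM REMAINDER MYREM4.
DIVIDE MYYEAR BY 100 GIVING MYNUM REMAINDER MYREM100.
DIVIDE MYYEAR BY 400 GIVING MYNUM REMAINDER MYREM400.
IF MYREM4 = 0 AND (MYREM100 NOT = 0 OR MYREM400 = 0)
    MOVE 'Y' TO MYLEAP-FLAG
END-IF.

This correctly treats 2000 as a leap year and 2100 as a common year.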

Thursday, November 02, 2006

Retrospect of days with DOS assembler

The fourth movement of Brahms's Third Piano Sonata in f minor (Op 5) is a dirge in B-flat minor called "Retrospect." Some of these entries on this blog recreate a few situations in my 30-year mainframe career.

In the mid 1980s, I worked at a credit reporting company in Dallas, and one of the things I did was convert the monthly and daily billing systems from DOS Assembler to OS Assembler. (Then we replaced the Daily Billing with COBOL.) Now where OS has DCB's, DOS had DTF's, and the job ran under OS JCL as PGM=DUO with the actual program name as an execution parm. That is, DUO emulation. If you wanted to update the files, you ran with the UPSI execution parameter turned on. You had to be careful with dataset nodes, because in the mid 1980s there was no security preventing accidental update of production files.

Shortly after the OS conversion of Monthly Billing (I actually babysat the implementation New Years Day 1986 right after my father had died, before getting on a plane to fly home), we ran into an interesting problem with halfword or fullword boundaries in defined storage in assembler. It seemed that certain instructions did not work properly, and as a result quarterly fixed billing for some members did not work, but we did not find out about that until after the fact, mid month. So we had to rerun a lot of stuff. A programmer had made a change, causing the DS alignment to go off.

The other big technical controversy was addressability. I heard horror tales of Y2K conversions of ALC programs with up to seven base registers! We would use the technique of a temporary base register, with R15 as a pivot point.

In those days, the Bible for Assembler Language programming was George Struble, "Assembler Language Programming: The IBM System 360 and 370, 2nd Edition", from Addison-Wesley, 1969 and 1975. How many programmers learned, say, floating point arithmetic in IBM mainframe assembler? Not many. Today the main client that uses Assembler seems to be the IRS, and usually when they look for people they need extreme proficiency in it, including all of the mathematical instructions.

Wednesday, November 01, 2006

The most common COBOL interview question

The most common technical interview question that I would get over the years was "What's the difference between an ordinary SEARCH and a SEARCH ALL? What precautions are necessary in using them?"

Of course, we all know that a SEARCH ALL is a binary search, and that the items must be in sequence for a binary search (successive splits of the card deck) to work. Binary logic -- and the mathematics of logarithms and exponents -- is what makes search engines today work.
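A sketch of the setup (names invented). The key precautions: for SEARCH ALL the table data must actually be in order on the ASCENDING KEY, and for an ordinary SEARCH you must SET the index before searching:

01  MYSTATE-TABLE.
    05  MYSTATE-ENTRY OCCURS 50 TIMES
            ASCENDING KEY IS MYSTATE-CODE
            INDEXED BY MYSTATE-IX.
        10  MYSTATE-CODE  PIC XX.
        10  MYSTATE-NAME  PIC X(20).

SEARCH ALL MYSTATE-ENTRY
    AT END PERFORM 9100-NOT-FOUND
    WHEN MYSTATE-CODE (MYSTATE-IX) = WS-WANTED-STATE
        MOVE MYSTATE-NAME (MYSTATE-IX) TO WS-STATE-NAME
END-SEARCH.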

In 1988/1989, I was in a job where the consulting company (which produced hospital Medicare operating margin reports for clients) was charged for its CPU time, and I reduced the CPU time for a particular outlier job by reading a VSAM file sequentially into a table and using a binary search. The job ran much faster and cost much less. At one point, one day in January 1989, we had to rerun a simulation quickly, to even stay in business, and get correct simulation results after changing some parameters based on the Federal Register. Fortunately, I had made this change, and the successful rerun was done in less than an hour.

IDMS: a cautionary tale about the Central Version

I have gotten a call or two from recruiters about some old mainframe experience that I had with IDMS. This was a network database model with Schemas, and it had at one time belonged to Cullinane Corporation. It should not be confused with IBM's IMS, which is a hierarchical database model.

In fact, in the 1970s Sperry Univac had owned a similar product, DMS-1100, which had the same kind of structure and was intended for its 1100 series computers (1108, 1110), running under Exec8, whose command-style job control was somewhat like Linux today (simpler than IBM mainframe JCL with its verbosity). In fact, Exec8 offered an automated JCL generator called SSG in the 1970s, which was more flexible than IBM's concept of Procs. While working for Univac, I did a training exercise putting my classical record library on DMS-1100, something easily done today with products like MySQL. In the 1970s, NBC (National Broadcasting Company) was a Univac shop (relatively rare in New York City then) and used DMS-1100 and SSG heavily, as well as "Ascii COBOL".

IDMS could work with either its own format, or with its VSAM transparency, where the data was actually on VSAM files that could be manipulated with standard IBM utilities (IDCAMS). On the salary deduction billing and collection system in the early 1990s, I worked with the VSAM transparency.

Database access could be controlled either with the Central Version (the "CV"), via the inclusion of one specific CV dataset under a specific DD name in a specific STEP in the JCL, or in local mode, in which the actual datasets (in our case, VSAM files) had to be named in the specific STEP (or for the whole job).

When running in local mode, normal security worked: a programmer could not update the production database without specific access. But in the Central Version there was a loophole, because in the 90s at least, Top Secret could not see past the CV control dataset to protect update access to the specific VSAM files. There was a risk that with a wrong CV dataset (pointing to the production files) anyone could update billing or collection data with an ordinary submitted batch job, and that the update would not even be noticed for a long time, making recovery very difficult. I don't know if IDMS eventually fixed this.