Saturday, September 29, 2007

Corporate sites with JavaScript and database lookups for consumers: a tip


I’ve noticed that some companies offering financial or retail services to consumers online, when they build web pages with JavaScript, sometimes place hard-coded text content (often embedded in the “newContent” parameters of JavaScript functions) that is viewable in browsers under “view source” but that is probably not appropriate for all consumers to see. Sometimes they put the hard-coded content for every possible consumer on one page, even though not all of it is meant for every consumer. It would be more appropriate for that text to come from a database and be shown only to the appropriate visitor or consumer. No, I won’t mention any names or misuse any information; I just want to pass this on as a programming issue.

Typically, most of these pages make one or more database calls (SQL) to find the information the consumer requested. Sometimes the calls go to image index files (images of mainframe documents) and get errors (often security- or access-level related), leading to default error messages that are incorrect or misleading. Under “view source” in a browser (IE or Mozilla), the visitor can see his own information from the database and all of the JavaScript code, including the hard-coded text. Companies may really not want visitors to be able to see all of this.
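
To make the issue concrete, here is a minimal sketch in TypeScript, purely for illustration (the function names, the “notice” element, and the “/api/notice” endpoint are my own invented examples, not anything from a real site). The first version ships every consumer’s message to every browser; the second asks the server for just the logged-in consumer’s message.

    // Anti-pattern: all possible consumer messages are hard-coded in the
    // page's script, so anyone can read all of them with "view source."
    const newContentByStatus: Record<string, string> = {
      goodStanding: "Your account is in good standing.",
      pastDue: "Your account is past due; a hold has been placed.",
      closed: "This account has been closed. Call customer service.",
    };

    function showHardCoded(status: string): void {
      document.getElementById("notice")!.textContent =
        newContentByStatus[status] ?? "";
    }

    // Better: the page requests only this consumer's message; the server
    // checks the session and does the database lookup, so "view source"
    // reveals nothing about other consumers.
    async function showFromServer(): Promise<void> {
      const resp = await fetch("/api/notice", { credentials: "same-origin" });
      if (!resp.ok) throw new Error(`lookup failed: ${resp.status}`);
      const { message } = (await resp.json()) as { message: string };
      document.getElementById("notice")!.textContent = message;
    }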

Friday, September 21, 2007

History of Computing Culture 103: Bradford and New York State MMIS (finally on IBM), 1977


What followed NBC was a migration to IBM. Since there were only six or seven Univac installations in New York City in the mid-to-late 1970s, someone with an IBM mainframe background was much more marketable. So I managed to get an interview with Bradford National Corporation when it had to staff up suddenly after winning the contract for the New York State Medicaid Management Information System (MMIS) in 1977. On May 31, 1977 I started there at 100 Church Street in lower Manhattan (a building that would be slightly affected on 9/11). I remember riding down from the headquarters at 1700 Broadway and being told that we were “consultants.”

Bradford National Corporation would eventually be bought by McDonnell-Douglas in the 1980s.

In those days, you wrote program specs in handwriting and gave them to a typing pool. We had a terminal row or “tube city” and used Roscoe procs to compile programs. I worked on the back end, or MARS (“Management and Administrative Reporting”). The system consisted of an extract from the claims detail, sorts of the extracts into various sequences or “legs,” and then the reports. New York State auditors came down to analyze the system tests, with the most sensitive reports being those on nursing homes, since SNFs had more federal reimbursement than (custodial) ICFs. All the files were on tape, and the end-of-month reports took extremely long to run with 1978 technology. But the operating system was already MVS, with all programming in COBOL.

I had nineteen months of MMIS experience. In 2002 and 2003, recruiters started calling programmers with MMIS experience, but most jobs required two to five years of it. The system must have changed a lot since my time.

Wednesday, September 19, 2007

History of Computing Culture 102: NBC (with the RCA Spectra and Univac 1110, in the 1970s)



Let’s continue! Once my resume was on the loose, I got a call from a director at NBC who had moved up there from the RCA operations research career program (Sunday post). On Monday, August 12, 1974, three days after Nixon’s resignation, I started as a programmer-analyst. We worked on the 14th floor of a satellite wing at 6th Ave and 49th Street in the (now GE) RCA Building (there was no 13th floor), with the Univac 1110 and RCA Spectra on the 8th floor.

I even remember Gerald Ford speaking to the nation that night: “I am a Ford, not a Model T.” But during the first week of September (after a Labor Day weekend in Mexico City to “celebrate”) I moved into the Cast Iron Building at 11th and Broadway to start a new life. I sold the car to a Univac employee.

The pace was slower in those days. The project was to implement a new general ledger system. One had to read the transaction tapes on the Spectra 70 and convert them to be readable on the Univac 1110. For the ledger itself we purchased a package from Infonational and converted it to Univac ASCII COBOL, which did not cause significant problems.

The Spectra part included the first COBOL program I ever designed. This was all done with punched cards, in the days before structured programming, go-to-less programming, self-documenting code, top-down testing, and the like were the expected norms. So aesthetically my first programs on that machine were ugly to look at. But once implemented, they ran perfectly every accounting closing. They needed to, because fixing them on the fly would have been unthinkable on an old computer.

Working with the purchased COBOL programs on the Univac was much easier. We had teletype terminals that had no CRT display but kept a paper record of what you typed and of the system’s responses. We had semi-private offices, two people per office. There was a rule against “compiling in demand” during normal business hours, but you could schedule a batch job to compile and link, and usually it ran right away. Exec 8 was very convenient, much less verbose than the IBM DOS or OS JCL I would encounter later in my career. It also had an automatic jobstream generator, SSG, something IBM didn’t replicate until JES2 and JES3.

Accounting cycles usually consist of daily or weekly voucher registers and proofs, including a final proof at end of month. Each proof was printed with carbons and comprised many stacks of greenbar computer paper that were separated and given to users. Accountants made adjusting entries to the proofs. There was also a chart of accounts, maintained in those days with batch jobs run before the cycle. End of month could be a bear because of the huge detail sort in the last voucher register; in those days, it could take a Univac 1110 three or four hours to sort 300,000 records or so. I learned what it was like to be “on call” for my own applications. By the mid-1980s, an Amdahl or IBM mainframe could do the same in a few minutes.

The mechanics of how we worked deserve note. The paper tape came in handy. These were days long before sophisticated system security and “separation of functions” according to the job. Programmers had full update access to production files. We often set up test files as copies of production ones. (That is not acceptable today in many shops because of consumer privacy, but this was decades before modern security and privacy concerns hit the media.) If a programmer inadvertently reversed the order of file qualifiers in a copy (GL and XGL, for example), a production file could be overwritten and it would not be noticed until after the closing was run. So we kept the hardcopy terminal tapes of exactly what we did; that was the “security.” By the late 1970s, however, companies were learning that it would pay to install security systems and safer ways of working.

This was a job. In time, I came to understand the virtue of good coding practices as we now know them. Generally, I did not think much about the “glamour” of the media. Television studio tours were available (you didn’t dare visit them during working hours, or you could get fired). The one exception was when we were invited to work on soap opera sets for a few weeks in the spring of 1976 during the NABET strike (link May 27). That was an interesting taste of the “real world.”

I certainly wonder how the information technology environment must have changed, several times over, since the 1970s, with the GE and Universal mergers, and the new generations of web technology and monumental changes in the legal and reporting environments.

Tuesday, September 18, 2007

History of Computing Culture 101: non-IBM, without a "marketing profile"


Carrying on the History of Computing Culture series I started on Sunday, I venture further into the subject of non-IBM mainframes in olden times – and especially the business of trying to sell them.

In the early 70s, besides IBM, the other players were Univac, Burroughs, NCR, RCA (Spectra), Honeywell, DEC and Data General, and General Electric. In time, they would drop out or merge, and various brands would get eliminated. Univac was probably the largest competitor; Sperry Rand, which owned Univac, was a large conglomerate with a major skyscraper near Rockefeller Center in New York. Given how things looked in the 50s, it might have outpaced IBM; IBM, though, turned out to be the better marketer. Univac had an efficient JCL called “Exec 8” that was easy to learn and code, with simple commands a bit like today’s Unix. Univac sold three large mainframes with its proprietary architecture (1106, 1108, 1110), and a “minicomputer” imitation of the 360 called the 9000 series.

In the spring and summer of 1972, a couple of friends at NAVCOSSACT left to work for Univac as instructors in its education center in Tyson’s Corner, VA. I almost did the same. I had an interview with Univac at Bell Labs (a revisit), and I remember a bizarre question from the Univac interviewer: “Do you like programming?” I was 29 then and wanted more adventure; my friends had it. On Aug. 23, 1972 I got a sudden call at home from a Univac branch manager in New Jersey. I went up and interviewed at the Montclair branch on Aug. 30 and started a “new life” on September 25.

I was officially a “Systems Analyst,” and the job was to support sales teams at client sites. I was assigned to Public Service Electric and Gas in downtown Newark, which gave easy access to New York City. My personal life (other blogs) was “changing,” but I had a convenient garden apartment in Caldwell, with pretty efficient bus service. There were five staff members assigned to the account, and I was the “processor support person” for FORTRAN and COBOL; at the time, there was still a lot of FORTRAN. In the “Management by Objectives” jargon of the time (now the buzzwords are “Total Quality Management” and “Team Handbook”), the goal of the team was to get an 1106 machine on rent by a certain date. A couple of staff members spent all their time analyzing panic dumps from system crashes, which did happen then, and installing fixes with SYSGENs. (Dumps in Univac were in octal, not hex; the most common character sets were Fieldata and ASCII.) Essentially they were what we today call “systems programmers.” The following spring, we ran benchmarks of an 1110 at the test facility in Eagan, Minnesota, a suburb of Minneapolis-St. Paul, just off the 494 “strip” (where the Mall of America is now).

Univac then tended to be ahead of IBM in programmer online access; most programmers at PSEG had their own terminals, some of them teletype, a few cathode ray. Univac also made sophisticated keypunch equipment; in fact, the third floor of the Montclair branch office (where I had a little-used desk) was a major center for keypunch distribution and sales. While at PSEG I wrote an assembler program, called BIGBR or “Big Brother,” to read the log tapes and monitor how much use each programmer made of various facilities. That wasn’t such a big deal there, but in those days computer time was expensive, and in some companies programmers could be penalized for needing too many “shots” to get a program working. (This was particularly true overseas.)
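
The original BIGBR was Univac assembler reading log tapes and is long gone; purely as a conceptual sketch (in modern TypeScript, which of course didn’t exist then, and assuming a made-up record layout of “userid facility seconds” that is my invention), the same tallying idea looks like this:

    // Total up each programmer's use of each facility from log records.
    // Assumed record format, one per line: "userid facility cpuSeconds"
    function tallyUsage(logLines: string[]): Map<string, Map<string, number>> {
      const byUser = new Map<string, Map<string, number>>();
      for (const line of logLines) {
        const [user, facility, secs] = line.trim().split(/\s+/);
        if (!user || !facility || !secs) continue; // skip malformed records
        const perFacility = byUser.get(user) ?? new Map<string, number>();
        perFacility.set(facility, (perFacility.get(facility) ?? 0) + Number(secs));
        byUser.set(user, perFacility);
      }
      return byUser;
    }

    const report = tallyUsage(["u1 COBOL 12.5", "u1 SORT 3.0", "u2 COBOL 40.2"]);
    console.log(report.get("u1")?.get("COBOL")); // 12.5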

After the benchmarks, the branch manager came to the conclusion that I did not have a “marketing profile” (what does that mean between the lines?) and ought to transfer. (On Oct. 2 in this blog, I talked about “Can Techies Sell?”) Dress at Univac was not the big deal it was at IBM; my first day on the job I wore a chartreuse-colored suit, and other reps had lively, sometimes flamboyant suits that would not have met the more conservative standards at IBM, and certainly not at EDS, at the time. Even the salesmen were a bit showy. (EDS, in a memo I saw once, claimed that its dress code was intended to gain the confidence of customers who did not understand computers.) Rumor had it that companies like that told you what kind of car they expected you to drive. (I had a Pinto.)

I did get to take a two-week COBOL course at Tyson’s, taught by one of the friends who had left before I did. That was my first introduction to what would become the mainframe procedural language for business applications for three decades. It wasn’t apparent how important COBOL would get until the early 70s, when so many financial, manufacturing, and retail companies began writing their own in-house applications, before the large software vendors grew.

I was assigned to a smaller account, Axicom or Transport Data, for a while, before interviewing with the Bell Labs account and getting transferred to the AT&T account in Piscataway, NJ, farther from the City and less convenient (although close to the Metropark commuter station and on the “Blue Star” route). I got an apartment near Bound Brook, near the Raritan River, which has flooded twice since I left.

Pretty soon, I was invited to travel repeatedly to St. Paul for another 1110 benchmark, the object of which was to process the magic “1150 transactions per hour” on a new 1110. The Bell Labs programmers had written complicated simulations of the transactions that had to work, with lots of DMS-1100 calls. That database followed the network model also used by IDMS on the IBM mainframe, with a DDL, a schema, and location modes such as CALC and VIA a set. I trained myself by writing a little DMS-1100 application for my classical record library. Now, you ask, isn’t that computer use for personal business? Yes, but in those days it was okay if there was a legitimate learning purpose. Security and misuse (despite the expense of disk space and computer time) were not big concerns then, even on client computers.
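
I no longer recall actual DMS-1100 DDL syntax, so here is only a conceptual sketch (in TypeScript, with record names and a tiny hash invented for illustration) of the placement idea the CODASYL network products shared: a CALC record lands on a page computed by hashing its key, while a record stored via a set is placed near its owner, so walking the set touches few pages.

    // Conceptual CODASYL-style placement; not real DMS-1100 DDL.
    const PAGES = 100;

    // CALC: an owner record such as COMPOSER is located by hashing its key.
    function calcPage(key: string): number {
      let h = 0;
      for (const ch of key) h = (h * 31 + ch.charCodeAt(0)) % PAGES;
      return h;
    }

    // VIA set: a member record such as RECORDING is clustered with its owner.
    function viaPage(ownerPage: number): number {
      return ownerPage;
    }

    const composer = calcPage("BEETHOVEN");
    const recording = viaPage(composer);
    console.log(composer === recording); // true: the set clusters together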

We ran the transactions from punched card decks, and keeping the decks organized and ready was part of the job. We usually had computer time from 4 to 12, but as the final demos approached I certainly remember the all-nighters and the exhaustion. I was well into adult life. One of the biggest technical problems was the DMS-1100 “rollbacks” caused by “deadly embrace” or “Catch-22” deadlocks, which were finally resolved by having the parallel transactions apply their database updates in the same sequence.
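
That fix is now textbook: if every transaction touches records in one agreed order, no circular wait (“deadly embrace”) can form. A minimal sketch of the principle in TypeScript, with invented record keys (a real DBMS would hold the locks until commit under two-phase locking):

    // Deadlock avoidance by consistent ordering: sort the keys each
    // transaction will update, so no two transactions ever wait on each
    // other in a cycle.
    type Update = { key: string; apply: () => void };

    function runTransaction(updates: Update[]): void {
      const ordered = [...updates].sort((a, b) => a.key.localeCompare(b.key));
      for (const u of ordered) {
        // a real DBMS would acquire the lock on u.key here...
        u.apply();
        // ...and release all locks only at commit.
      }
    }

    // Two transactions that would deadlock going A->B and B->A
    // now both proceed in A->B order.
    runTransaction([{ key: "B", apply: () => {} }, { key: "A", apply: () => {} }]);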

After the benchmarks, I was assigned to the AT&T account and spent a lot of time in lower Manhattan, and some in Westchester County. It’s hard to get anywhere just troubleshooting and supporting customers’ applications unless one moves into marketing. It was apparent that I should code my own applications again, and I wanted to move into the City. That started the next chapter of my career.

Monday, September 17, 2007

Remembering Y2K: When have you tested everything? What data do you keep? What data covers everything?


Remember Y2K? Back in 1999, we had a big workplace debate about what, from a Philosophy 201 perspective, constituted a satisfactory repository of evidence that all of our systems (in a life and annuity company) would perform properly on and after Saturday, Jan. 1, 2000 (and, for that matter, Monday, Jan. 1, 2001), since it was necessary to expand the year to four digits. (It’s a bit more involved than that with some systems, but that was the idea.) There were several questions: What jobs should be run? Which cycles (end of month? end of year?)? What printouts or files should be saved (file-to-file compares? reports? test data?)? What production data should be extracted? How would it all be collected and stored? In the fall of 1999, we wound up boxing a lot of JCL, reports, and screen prints and shipping them to an official warehouse. Y2K came and went without a hitch.
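
For the date expansion itself, the standard trick in converted code was a pivot “window” for any two-digit years still coming in on old files. A sketch in TypeScript (the pivot value of 50 is just illustrative; every shop picked its own cutoff):

    // Expand a two-digit year with a sliding window.
    // With PIVOT = 50: 00-49 become 2000-2049, 50-99 become 1950-1999.
    const PIVOT = 50;

    function expandYear(yy: number): number {
      if (!Number.isInteger(yy) || yy < 0 || yy > 99) {
        throw new RangeError(`not a two-digit year: ${yy}`);
      }
      return yy < PIVOT ? 2000 + yy : 1900 + yy;
    }

    console.log(expandYear(99)); // 1999
    console.log(expandYear(3));  // 2003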

We had a similar exercise early in the year with a disaster recovery fire drill (at a company called Comdisco) that I remember well. What data and what files do you collect, and what do you run on the backup site to prove it all got copied?

Back in the early 1990s we had philosophical discussions of this sort. One had to make sure that all possible situations were covered by test cases, extracted production data, or selected production cycles (now a bigger issue than then because of privacy considerations). Before any elevation, there would be the exercise of parallel cycles, file-to-file compares, and saving evidence in the form of printouts, screen prints, and sometimes files on disk or offloaded to diskettes (maybe copied to the “LAN,” which was then a real innovation). Because of “personal responsibility” I kept quite a library of test runs in big black three-ring binders, low-tech. This way it was possible to prove that the system had been tested properly if something ever went wrong. That may sound like a lack of confidence.
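
The file-to-file compare itself is conceptually simple: run the old and new systems on the same input in parallel cycles, then diff the outputs record by record. A toy TypeScript sketch of the idea (record contents invented):

    // Compare two parallel-cycle output files record by record and report
    // the first few mismatches -- the evidence that went into the binders.
    function compareFiles(oldRecs: string[], newRecs: string[], maxDiffs = 10): string[] {
      const diffs: string[] = [];
      const n = Math.max(oldRecs.length, newRecs.length);
      for (let i = 0; i < n && diffs.length < maxDiffs; i++) {
        if (oldRecs[i] !== newRecs[i]) {
          diffs.push(`record ${i + 1}: old=<${oldRecs[i] ?? "EOF"}> new=<${newRecs[i] ?? "EOF"}>`);
        }
      }
      return diffs;
    }

    console.log(compareFiles(["A1", "B2"], ["A1", "B3"]));
    // -> ["record 2: old=<B2> new=<B3>"]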

There was also the issue, emerging then, that source management software (then CA-Librarian, today usually CA-Endevor) had to be used properly to guarantee source-to-load-module integrity.

In fact, as far back as early 1989, I essentially “saved” a small health care consulting business (small then, big now) by keeping a huge paper library of test runs. I spent three weeks desk-checking numbers in a windowless office with no PC terminal of my own. When a major client questioned our numbers, I was able to prove we had run everything properly. Re-examination of Federal Register specs and of COBOL code from a federal program showed a discrepancy within the government’s own work. When I replicated the federal code in our system and reran the model and simulations, we quickly got the results the client had expected.

There is a lesson in all of this. Remember that undergraduate Philosophy 101 course where the professor asks “how do you know what you believe?” or something like that? Remember those essay questions on epistemology? (I got a B on that.) Systems testing and quality assurance are all about that, when a system must run in production and process millions of client transactions daily, perfectly. It’s volume, buddy, along with absolute perfection. That’s what mainframe culture was all about.

It seems that one can blow this kind of question up when we look at major issues today. How do we know that we have collected all of the relevant data or cases and that it is right?

Sunday, September 16, 2007

Army-Navy and in between: RCA Spectra and Univac: mainframe history


My two years of Army service, 1968-1970, began when I “volunteered for the draft” by enlisting for two years, wound up with an RA number (RA11937256), and took that 95% chance of ending up in the infantry, the queen of battle. Well, my MOS out of Basic (even given a few weeks of STC – Special Training Company) was “01E20” – Mathematician. I spent the summer of 1968 – three months – in the Pentagon and, after a mysterious transfer, the rest of my hitch at Fort Eustis, VA (“Fort Useless”) with the Combat Development Command Transportation Agency (USACDCTA), in that “white building” that no longer stands.

That service, in theory, should have provided computer experience. In practice it was minimal, to say the least. At the Pentagon, we filled in coding sheets classifying units as Combat, Combat Support (Engineers), and Combat Service Support (CSS). At Fort Eustis, we coded a library cataloguing system (again on coding sheets) called SPIRAL. That kept me out of the rice paddies, a morally controversial ploy in its day. I also got a chance to study and read about a simulation package called SIMSCRIPT.

In the middle of 1969 I started researching what my first job would be. Companies would respond with form letters on fancy letterheads, but some of them bit. Because of my SIMSCRIPT background, I got flown to an interview with Rand in California (Rand would write the million-dollar, unheeded proposal in 1993 on how to lift the ban on gays in the military). I was flown to Syracuse in December to interview with GE Heavy Military Equipment, and to New Jersey for Bell Labs and for RCA Labs. In those days, companies paid the interviewing expenses of people with graduate degrees (I had an MA in math from the University of Kansas).

Rand and GE lost some budget to Nixon’s cutbacks, already taking hold. But Bell Labs and RCA came through with offers. For both interviews, I had to give technical talks on my Master’s thesis (“Minimax Rational Function Approximation”). I wound up taking the RCA offer: the Operations Research Training Program at the David Sarnoff Research Center in Princeton, NJ, near the Princeton Junction station on Route 571, a few miles from the University. (I understand that this Center now belongs to SAIC.) I lived in an apartment in what was then called Cranbury and is now called East Windsor. (RCA also had an MIS training program, where programmers roomed in a motel while being trained for ten weeks in COBOL and assembler, something that conjures up images of how EDS trained its systems engineers during that era.)

Operations research conjures up ideas of linear programming and optimization, and it does include these. At RCA, however, the program consisted of a few “assignments” at various RCA locations. After three months at the labs, I was sent to Indianapolis, to a television manufacturing plant, where I was supposed to complete a dynamic programming model to optimize production lines. The model was written in Fortran and run from punched cards on an RCA Spectra 70. At the time, the Spectra was pretty much a clone of IBM, with the same assembler and languages. The system was totally inadequate for processing the algorithm. Today there would probably be nothing to it, and I suspect there are dynamic programming routines to solve this kind of problem in Java libraries.
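
The Indianapolis model is long gone, so the following TypeScript sketch shows only the flavor of that kind of problem: the classic two-line assembly formulation, where at each station the recurrence keeps the cheaper of “stay on this line” or “transfer from the other line.” All the numbers are made up.

    // Two-line assembly scheduling by dynamic programming.
    // cost[line][j]: processing cost at station j on that line.
    // transfer[line][j]: cost of switching FROM that line before station j.
    function assemblyLine(cost: number[][], transfer: number[][]): number {
      const n = cost[0].length;
      let f0 = cost[0][0]; // best cost so far, ending on line 0
      let f1 = cost[1][0]; // best cost so far, ending on line 1
      for (let j = 1; j < n; j++) {
        const next0 = Math.min(f0, f1 + transfer[1][j]) + cost[0][j];
        const next1 = Math.min(f1, f0 + transfer[0][j]) + cost[1][j];
        f0 = next0;
        f1 = next1;
      }
      return Math.min(f0, f1);
    }

    const best = assemblyLine(
      [[4, 5, 3], [2, 10, 1]],  // per-station costs, lines 0 and 1
      [[0, 7, 4], [0, 9, 2]],   // transfer costs into station j
    );
    console.log(best); // 12: cheapest route through three stations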

I also worked on a manpower allocation model at Cherry Hill, NJ. We would work on TTY terminals (with paper roll output, no CRT) and diddle around with the data.

This did not result in an offer. RCA television and other sales dropped off in 1970, and I was laid off in February 1971. That was my only layoff until December 2001 (thirty more years). Luckily, I knew someone in the Navy department, and that would lead me to Univac. So I went back to work for the military as a civilian at the Naval Command Systems Support Activity (NAVCOSSACT) in the Washington Navy Yard, now unrecognizable with all the development, working on Fortran simulations on a Univac 1108 with “Exec 8,” a command-like JCL that resembles Unix or Linux. I used to park on Water Street, not too far from the new Nats Stadium.

Thursday, September 13, 2007

My career began on the IBM 7090 (in 1965)


It seems as if I stumbled into information technology as a career because it was a safer choice than music and piano, especially in a Cold War world with a draft. I actually got in on the government’s dime. My first formal job, from 1963-1964, was as a GS-4 chemistry laboratory assistant (rheology, measuring the viscosity of standard oils) at the National Bureau of Standards (at Connecticut Ave. and Van Ness St. in DC – now the site of the University of the District of Columbia, but at the time a brick-building campus that would become Federal City College, complete with underground tunnels). But the first job to launch me somewhere was at the David Taylor Model Basin, now part of the Naval Surface Warfare Center, in Carderock, MD, right where the I-495 Beltway crosses the Potomac. (The notorious Beltway was already there then.)

The job comprised Fortran programming on the IBM 7090, a predecessor to the 360 architecture. We spent every morning of the summer of 1965 in training – a good deal, getting a college-level course on the government’s dime while being paid. The last summer (1967) I was a GS-7 because I had my BS from GWU. At the time, defense was all the rage, and the projects had to do with underwater buckling pressures. (A bit of positive karma or foreshadowing for my 1993 Norfolk “civilian” submarine visit, discussed in my book.) We got to walk through the wind tunnels. We got a field trip downtown to see the “new” 360 that first summer. We submitted decks to compile and execute and had little reason to learn the “JCL.” There was also an assembler language called SAL and SLA, much simpler than the MVS mainframe assembler IBM would develop.

During the 1966-1967 academic year in graduate school at the University of Kansas in Lawrence, I worked on similar projects as a research assistant for physics professors, on a General Electric machine, with Fortran, and the technique was pretty much the same. At least I learned that there were other mainframe companies “trying” to compete with IBM, a situation that would become more and more important as my “career” evolved. I think the last movement of the Shostakovich 13th Symphony is called “A Career.”

In those days, we coded on sheets with columns, and turned them in for keypunching. Or we keypunched ourselves, and got good at it quickly.

In two years, man would set foot on the Moon. What a time.

Saturday, September 08, 2007

An old lesson on the risks of uploading anything even to "private space" (with 1981 "mainframe" technology)


In 1979, after four and a half very interesting years living in Greenwich Village, I left New York City for Dallas to work for the Combined A&B Medicare Consortium (“CABCO”) of (then) six Blue Cross and Blue Shield plans around the country. “The Project” was supposed to develop a state-of-the-art Medicare claims processing and reporting system, competing with EDS (a circumstance that created an immediate conflict for the host Texas plan). That would start an interesting 9-1/2 year period in Dallas, some of the personal aspects of which I have discussed on other blogs.

I have written in some detail about what happened there, especially on Nov. 13, 2006 (“End User Computing Flexibility”), March 27, and July 16. We used a system development methodology from M. Bryce Associates called “Pride-Logik” that tended to control the development process and run the show. The project manager had a sign in front of his office in the Zale Building (in 1980 the project moved up Stemmons a bit): “Abandon All Ye Who Enter Here.” He had a certain style of Zen management that did not rein in the political and turf battles of the six plans. That led to the failure of the project after three years of running around in circles. This was harmful to me: I spent close to three years getting some good design experience (IMS and CICS, the staples of the time), but we never implemented. We did hire “programmers” near the end who coded a few reporting modules under my supervision. That was another thing: we had a division between “analysts” and “programmers,” which may seem passé today – except that today the analysts would be called “architects” and would often be coordinating work done by programmers overseas. Had the project succeeded, I did intend to “move up,” because IMS and CICS were considered a competitive production work environment by the job market standards of the day.

There was one particular incident in June 1981 (I left in October, seeing “the handwriting on the wall”; the project folded at the end of January 1982) that teaches a lesson oddly relevant to today. We had been writing pseudocode (as part of “Phase 4”) and had gotten a small in-house mainframe (a 3330, I think – at the time a big deal). We had a crude TSO setup (I don’t think we had ISPF, and certainly not Roscoe), and I had saved some bits of pseudocode in a dataset for reuse in various specifications. One day in June a “librarian” pulled off all of our work and gave it to others to “review.” I had not intended that dataset to be “public” and was shocked when people thought I had handed in “junk” work. It was supposed to be a private work dataset, much like a private Word document on a home PC (not published to the Internet). Or perhaps it was analogous to an element of a social networking profile not published to the world but whitelisted to a known list, and that still “gets out.” In those days, people worked with paper and pencil a lot (we hand-drew system flowcharts and structure charts, long before the days of Visio), and logging on to “the system” implied a bit of a commitment. But actually, the MMIS system at Bradford in NYC where I had worked in 1977-1978 had been more advanced, with Roscoe and individual office rooms, but still a “tube city.”