Wednesday, March 26, 2008
Ever notice the way the paradigm for “job skills” changes?
I spent years as an individual contributor, becoming valuable to a few employers because of intricate, exotic knowledge of the idiosyncrasies of a few applications (whether monthly and daily billing in a credit reporting company, or reading the documentation in Propac and matching it to COBOL code on Medicare reimbursement, or the salary deduction billing and report stacking, or the interfaces between the mainframe and a USPS computer for NCOA processing). The company would depend on me to get through the latest crisis (“end of month’s on fire”, or keeping a skittish client who doubts the accuracy of some tables, or getting through complicated interfaces). Professionally, I would have a unifocal world, and not miss what the rest of the world of work does, although I kept tuned to current events in the broader sense (especially the political issues).
Managers, I thought, were “expendable.” And sometimes, in the 80s and early 90s, with the pyramid flattening and expanded “span of control” that became popular after corporate buyouts and mergers, "middle management" did become layoff fodder. The companies believed they depended on the “super-indians” to keep their shops running. It was important to have those real “hands on” skills.
I have to say this has caught up with me. “Turnabout is fair play,” perhaps. (“Turnabout” was an important classical music budget label for years.) Now, the “skills” are more about integrating a lot of different bits of technical disciplines in order to get my own content ready to sell. Often, however, when there is a problem at my ISP (as with my Urchin stats not running), I feel that I know more about how to solve their problem (for example, how automated replication schedules work) than does the tech who answers the phone. If only I had a job there, I could get my own problem fixed! (It’s a tautology: support is support!)
But I also appreciate better, now, why “management” was so often “non-technical.” Right now, in running websites that are content intensive, I find that there just is not time to keep up the intricate hands-on coding skills (in, for example, Java) at the level that commercial shops require. Early in my career, the pace was slower, and there was “time” to perfect COBOL and JCL, and even then the techniques used (structured programming, whether to do random access or to sort and process sequentially – usually faster) could get ahead of one’s work and vision. Things just change too quickly. Hence, it’s likely some day I will wind up in the shoes of “management,” but hopefully with something that I created.
What do they say: Change is Good!
Monday, March 24, 2008
The recent flap about “curious” contractors at the State Department who peeked at the passport files of the three major presidential candidates reminds us of some serious workplace issues. Huffington’s account of this on March 21 2008 is here. Keep in mind, these were supposedly "good" or "exemplary" employees or contractors who would never actually abuse information. They were just "curious."
Most customer service employees have read and update access to a large list of individuals. In information technology, technical employees usually had read access, and update access when specifically requested through formal channels. Some programmers resent the time and trouble of requesting access and say that they should have it, and that all that is necessary is for programmers to be bonded. On the other hand, it is much safer to restrict access. Sometimes database protocols (like the IDMS central version when it is given as a DD in JCL) make it difficult to install security in batch programs. I think there was an issue like this with Information Expert (the other “IE”) in mainframe MSA systems from Dun and Bradstreet in the 1990s.
But this even goes further. At the IRS, customer service employees have been fired for snooping on accounts that were not in their assigned “range” – before the IRS had the ability to restrict mainframe access further by employee. Furthermore, much large scale quality assurance testing in shops is done by making copies of data from production regions, making data available to employees. Associates (and contractors) often work from home, often on computers that they own, allowing the possibility of data intermingling, however unintentional, or may take laptops home, allowing the possibility of theft, which has happened several times in government agencies and major corporations (the most recent being NIH).
When I took a corporate transfer to Minneapolis in 1997, one motive was to have less contact with military personnel related data, since I had just finished and published a book that dealt, in large part, with the "don't ask don't tell" policy regarding gays in the military. I thought it was more appropriate to have as little contact with that group of customers "professionally" as possible, to avoid the "appearance" of "conflict of interest" or the "appearance" of "temptation" for abuse.
There is a lot of “best practices” security management to do in these areas.
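The "read by default, update only through formal channels" practice described above can be sketched in a few lines. This is purely an illustrative model, not any real shop's security system; the class and dataset names are made up.

```python
# Illustrative sketch of least-privilege access: everyone can read,
# but update access requires an explicit, formally approved grant.
# Names and structure are hypothetical, not from any real system.

class AccessControl:
    def __init__(self):
        # maps user -> set of datasets with granted update access
        self._update_grants = {}

    def grant_update(self, user, dataset):
        """Record a formally approved update grant (the 'paperwork')."""
        self._update_grants.setdefault(user, set()).add(dataset)

    def can_read(self, user, dataset):
        # in this sketch, any authenticated user may read
        return True

    def can_update(self, user, dataset):
        # update requires an explicit, auditable grant
        return dataset in self._update_grants.get(user, set())

acl = AccessControl()
acl.grant_update("programmer1", "PROD.BILLING.MASTER")
print(acl.can_update("programmer1", "PROD.BILLING.MASTER"))  # True
print(acl.can_update("programmer2", "PROD.BILLING.MASTER"))  # False
```

The point of structuring it this way is the audit trail: every update grant is a recorded decision, which is exactly what a bonded-but-unrestricted programmer model gives up.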
Thursday, March 20, 2008
Recreating the wordprocessing environment of the 80s: WriteRoom, Dark Room, "distraction free writing"
I do remember my first computer, a Radio Shack TRS-80, a complete console unit that I bought in December 1981 for about $3700. I remember the embedded black-and-white monitor: a 64-character display, white characters on a black oval screen. There was no hard drive, just large floppies.
Rob Pegoraro has a retrospective this morning (March 20, 2008) in The Washington Post Business Section, D1, “Fast Forward: That Green Again,” about a laptop (it seems to have a special operating system, or maybe Linux, or maybe something like the “one laptop per child” initiative) with an application called WriteRoom, which blacks out everything on the computer and turns it into a typewriter, with a black screen and light green characters (like the early IBM PCs and compatibles). The software is available from Hog Bay Software and offers “distraction free writing.” The Hog Bay website is not real clear about system requirements, as far as I could tell. (If someone knows, please comment.) I don’t know whether the freelance writers I met in various Minneapolis forums would use something like this. I think it works with wordprocessing, but not with screenwriting software (like Final Draft). But, remember how writers used to work? (As in the 1991 Coen Brothers movie Barton Fink?)
I do recall the advice of the "Writers Digest" crowd in the early 80s, to buy a computer that was specifically designed for writers. How things have changed since then! I remember those times fondly, before the Internet, when it was felt that quick advances in hardcopy desktop publishing could give enterprising writers a competitive advantage -- but there was no efficient way yet to "publish yourself."
There is a competing product for the Windows environment called “Dark Room” at this link.
I don’t think any of these will help me with my big “first novel” (all 42 chapters on my hard drive – and backed up many times – as I write), or my screenplays.
Rob Pegoraro’s tech Q&A column in the Post is here.
Of course, modern computers look like "Darkroom" in safe mode (Dell is white on gray).
Wednesday, March 12, 2008
Tech Republic has a “ten things” page hosted by Jody Gilbert, and today their blog has a column by Justin James in the category “Career Development”: “10 Traits to look for when hiring a programmer.” The blog link is here and there are plenty of message board responses. You may need to be a registered user of Tech Republic to see the content.
Most of the tips are rather obvious. Good “academic” skills are how I would summarize some of it. This is the stuff the “tests” in schools look for (and the awful “teach the test” problem for public school teachers). That is, basic “problem solving ability” in math, represented by those notorious “word problems” or “story problems” in Algebra I. (Isn’t that what a program solves? When you click on a buy link on Amazon, doesn’t the script have to work a “word problem” to compute your invoice?)
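That "word problem" analogy can be made literal. Here is a toy sketch of the arithmetic behind a checkout script; the figures and function name are invented for illustration, not anything Amazon actually runs.

```python
# A "word problem" as a program: compute an invoice total from
# quantity, unit price, shipping, and sales tax rate.
# All figures are made up; this is just Algebra I as code.
from decimal import Decimal, ROUND_HALF_UP

def invoice_total(quantity, unit_price, shipping, tax_rate):
    subtotal = quantity * unit_price
    # tax on goods plus shipping, rounded to the cent like real invoicing
    tax = ((subtotal + shipping) * tax_rate).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP)
    return subtotal + shipping + tax

# "3 items at $19.99, $4.50 shipping, 5% tax" -- a story problem
total = invoice_total(3, Decimal("19.99"), Decimal("4.50"), Decimal("0.05"))
print(total)  # 67.69
```

Note the use of `Decimal` rather than floats: money arithmetic is one place where the old COBOL habit of exact decimal representation is still the right answer.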
Another skill is reading speed and comprehension, outside of programming areas. That sounds like it’s right out of grade school, doesn’t it?
Some more of the traits, like passion and “attention to detail” are rather obvious. Back at my first major mainframe job in financial applications back in the middle 1970s at NBC, my office mate used to mutter to himself “attention to detail” as we realized how critical every accounting closing would be.
Obedience (I use my own term here) is an issue, because programmers do tend to be “libertarian individualists” who believe their own ways of doing things follow the principles of Reason Magazine. But you have to obey policies and rules (especially security rules regarding elevations and update access to production databases, and now, physical security of consumer data). Even libertarian think tanks like Cato have to have their internal “rules.”
The first trait listed, I discuss last: “curiosity.” Back in 1999, when I came back to Arlington and went to work in a local office briefly during a family emergency, a “new soul” but long-term office mate said, “Bill, you have an astonishing lack of curiosity.” It was as if his idea of enterprising character was to download every widget and try it. Well, of course, that can be dangerous, and that can break company rules. (It could even then; only approved software could be downloaded onto personal desktops.) But this person had taught himself to run a Unix or Apache web server off of a 386 in his own home back around 1993. You see his concept of “curiosity”? Where “curiosity” really comes in handy is in support jobs, where one has to research customer (internal or external) problem tickets and delve into systems that one did not write and does not know a lot about; often it is difficult to “motivate” an approach based on the documentation that an organization provides. It sounds a bit like “motivating” a proof in a graduate school math course, or even figuring out how to integrate a bizarre function (partial fractions, perhaps) in second year calculus. Some programming training courses emphasize “curiosity,” such as a PowerBuilder course where, on the first day, students were sent into Help to figure out how to do things even for the first time.
Perhaps we need a concept like "constructive curiosity," which aligns with "creative business thinking" and goes beyond code into analysis and understanding of deeper business needs. For example, how to integrate new, market-driven methods of ad delivery in a shared-hosting computing environment.
Monday, March 10, 2008
Recap: did our "culture" of computing really change radically over the decades? Yes, because end-users demanded it.
So how did the nature of the IT job market change over the years? I think someone – me – could write a book about it. Let’s say it all came in overlapping stages.
Remember the days of coding sheets and punched cards? Remember those run decks of “compile, load and go” back in the 60s?
IBM announced its 360 operating systems in 1964, and had pretty widely demonstrated them by 1966. By the late 60s and early 70s, companies were starting to develop their bread-and-butter batch processing and “time sharing” systems, often with much of it in assembler under DOS, with COBOL (and sometimes FORTRAN) gradually replacing it, and with MVS coming to the fore in the 70s. “Data processing” was a mystery then. Companies like EDS would hire and train “systems engineers” and instigate an almost military culture and dress code in order to keep the confidence of customers. (In fact, H. Ross Perot started out by hiring mostly military officers: “Eagles don’t flock, you have to hire them one at a time.” I remember seeing Exchange Park when I lived in Dallas, but by then EDS was long gone to Forest Lane.) IBM was also notorious for its dress code, even checking for garters.
In the 70s there were other companies competing with IBM for the mainframe market, including Univac and Burroughs. Univac actually had a more programmer-friendly environment, a simpler JCL (Exec 8, much like today’s Unix) and terminal access for programmers. But IBM dominated the job market and gradually drove out the competition. By the mid to late 70s, most major companies employed applications programmers to write, maintain and support their applications systems, which then relied heavily on overnight and end-of-month batch cycles, requiring on-call support when jobs went down (the “dreaded” SOC7 – or SOC4).
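For readers who never carried a pager for batch support: the SOC7 was a data exception, typically arithmetic attempted on a field containing non-numeric bytes. The defensive habit it taught, validating a field before computing with it, translates directly to modern languages. This sketch is an illustrative analog, not real COBOL edit logic; the record layout is invented.

```python
# The SOC7 abend was a data exception: arithmetic on a field holding
# non-numeric bytes. The defensive analog is an "edit step" that
# validates each field before use. Record layout here is hypothetical.

def parse_amount(field):
    """Validate a fixed-width numeric field before arithmetic,
    the way a batch edit step guards against a data exception."""
    text = field.strip()
    if not text or not text.isdigit():
        raise ValueError(f"bad numeric field: {field!r}")
    return int(text)

records = ["0001250", "0000075", "  ABC  "]  # the last would "abend"
good, bad = [], []
for rec in records:
    try:
        good.append(parse_amount(rec))
    except ValueError:
        bad.append(rec)  # route to a suspense/error file instead of abending
print(good)  # [1250, 75]
print(bad)   # ['  ABC  ']
```

The design point is the same one on-call programmers learned at 3 a.m.: reject or quarantine the bad record and keep the cycle running, rather than let one byte take down the night's batch.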
In the 80s, job submission and source management became much more automated and sophisticated, and by the end of the 80s large shops tended to have reasonable security, denying programmer access to production files on a regular basis. (There were always a lot of accidental loopholes, like the IDMS Central Version.) By the late 80s, most employers expected job candidates to have online experience with the main teleprocessing monitor, CICS, first in Macro and then Command Level mode. Most also expected a database, which was usually IMS (hierarchical) or IDMS (network) or even Datacom (inverted list), or Adabas, to be gradually replaced by DB2. Mainframe legacy applications, often originally written in-house, were being replaced by vendor supplied software (like Vantage for life insurance), and jobs tended to require experience with these packages.
But PCs had been introduced to home users in the early 80s, and gradually businesses were integrating them into their workflow. A number of applications, including databases (like dBase III+ and dBase IV), let users simulate on the desktop what companies did. In companies, end users wanted more control over their computing, and portions of databases would be offloaded onto desktops with simple DOS applications, sometimes in languages like Micro Focus COBOL. Offices had local area networks for documentation. But PCs were still used mainly to emulate 3270 terminals to get into mainframes, and users from home could dial in with products like Procomm and then pcAnywhere. By the middle 90s, companies were establishing direct connectivity to databases (especially DB2), or replicating data to mid-tiers and using Unix servers and various databases and OOP languages to present GUIs to users. The culture of online programming with character-based screens would gradually be replaced with very sophisticated end-user applications in object oriented languages, especially Java, which would become a production standard in amazingly short time, and end-user applications had amazing “artificial intelligence.”
Mainframe jobs would explode before Y2K as companies had to retrofit old applications for the Y2K midnight event (which really stretched for some time), but after the 2000-2001 recession mainframe jobs tended to be replaced by contracting gigs that required very specific experience. Many older programmers had difficulty keeping up with the incredibly rapid change in computing style after 2000, partly because of a basic misunderstanding of how it should be learned. Companies really did not replace legacy systems in COBOL with OOP systems; they built incredibly intelligent end-user workstation applications in the new languages, while the legacy applications became more static and tended to demand more specialized expertise to maintain. (Vantage has its own world with its bizarre I/O modules and linkedit decks and, as they say, “rules the world” for professionals who work with it.) The job market became difficult (besides economic vicissitudes and the offshoring of more conventional jobs) partly because the skillsets became so specialized and fragmented; the professional had to predict what would be hot and what he should become an expert in, and predict accurately.
Saturday, March 08, 2008
This past week, major media outlets presented reports criticizing the practice of “presenteeism”, where employees feel pressured to come to work sick.
The reports emphasized the risk of spreading viral infections (colds, influenza, norovirus, etc.) to other workers, some of whom will take infections home to children and the elderly. The reports suggested that management should change its attitude on this and send sick workers home. It was framed as an ethical problem, and a practical one: one cannot be as productive when sick.
In information technology, it’s a mixed bag. Many jobs are held by W-2 contractors, who will not get paid when sick. There is the sensitive possibility that a contractor, in these circumstances, has an “incentive” to come to work and spread illnesses to clients.
Yet the actual “danger” is really mixed. Many infections become subclinical and less disruptive as people get older. As adults, people with normal immune systems gradually develop increased resistance to infections similar to those that they have already had. Therefore, from the point of view of individual “self-interest,” gradual exposure to most everyday “germs” is a good thing, as one will develop resistance.
However, a few infections are novel, and exposure during an incubation period can pose grave dangers to everyone. It's not yet clear whether this would really be true if avian influenza (H5N1 or similar) started spreading readily person-to-person, but this is certainly a grave concern. That is why there can be profound ethical problems, and the likelihood of the closing of many businesses, should a sudden novel pandemic like H5N1 (or even something else now unknown, possibly an agent deliberately introduced) break out.
As for productivity, that depends on the kind of job. Many development jobs in information technology involve working alone at one’s own pace, and productivity is not as affected by mild illnesses. On the other hand, jobs in telephone support, operations or sales require more constant attentiveness and “regimentation,” and productivity is affected more.
Some states and local jurisdictions are considering toughening laws on sick leave, requiring employers with over a minimum number of associates to offer sick leave, and ending the practice of bundling sick leave and vacation (which can reward “working sick”). A few states (California, Washington, New Jersey) have passed or are considering laws that mandate even some paid family leave. These laws could have an effect on how information technology contracting works. It’s possible that in some jurisdictions, W-2 contracts with no benefits could become illegal. That would force staffing firms to offer only corp-to-corp arrangements. That could change the culture of IT consulting, place even more emphasis on the public online reputations of individual consultants, and make potential conflicts of interest an even more critical issue. Again, unintended consequences, perhaps.
Thursday, March 06, 2008
eWeek has a very interesting "slide show," "The 10 Most Wanted IT Skills." The link is here. The website will play the presentation as a "slide show," a lot like an old-fashioned grade school filmstrip, or like a PowerPoint business presentation.
The mix of skills is interesting. They start out with hardware and security, move on to soft skills ("people skills" and project management) and finally to programming languages. Web technologies and mobile come out as 9 and 10. It's pretty interesting, in a down market again.
Right after the 2001 downturn, in January 2002, Jim Thompson had an article in Computer User, "Hot Careers in a Cool Market," with a "what's hot and what's not" dot-point list comparison. COBOL was on the "what's not." (That may have changed somewhat now.) Microsoft Visual Studio .NET was near the top of the "what's hot." I doubt that's true now.
Wednesday, March 05, 2008
Business travel – it was an experience that I relished when I started “working” in 1970: the pre-security airline travel, the hotels, the rental cars (back in those days, rental car companies really did charge by the mile during the week, but “the company paid for it,” even on job interviews). It seemed like an adventure, running around New Jersey on someone else’s dime. The first “business trip” to another part of the country, in fact, would occur in April 1970 with RCA, with a day in Indianapolis. I would later have a 12-week assignment in Indianapolis (starting right after Kent State in Ohio), and be glad to come back “home” to the East Coast. Then, the Midwest seemed “dumb.”
The travel would occur again when I worked for Sperry Univac, particularly with eleven weeks in Minneapolis-St. Paul (especially Eagan, at the sprawling facility on Pilot Knob Road, still belonging to Unisys) on benchmarks. There were lots of all-nighters sitting in computer rooms with card decks and panic dumps. That was the style of work then. This was the winter and early spring of 1974, during the worst of the Arab oil embargo gasoline shortage, but I had my own rental car, and it did not affect things much in Minnesota. I was glad to get “home” again, but the great irony was that I would move to Minneapolis in 1997 (with ReliaStar, later ING) and spend, in some ways, some of the most interesting six years of my life. A downtown lifestyle in the Churchill Apartments, walking to work on the Skyway, with much of my “social life” a few blocks away (yes, Hennepin). The winters were not nearly as severe as they used to be (global warming?).
There was some travel around the country in the early 1980s with the Blue Cross and Blue Shield Combined Medicare Project, headquartered out of Dallas. But that fell apart (because of the political fighting among the plans), and I had to move on to Chilton, a credit reporting company (now Experian, after a couple of corporate moves). The data processing center was in a low-rise, curved yellow brick building on the fringe of Oak Lawn (the “gay” part of Dallas), on Fitzhugh. Despite conservative Texas and Reaganism, this was a place where you went to work, did your job as an individual contributor, supported your own work when it went into production, and lived your own life. Dress, even in 1981, was casual. (Even though culturally influential EDS was a few miles away on Forest Lane, heading for Plano.) I liked things that way. But the travel stopped, and life became really personal, and priorities changed.
But the little consulting company back in DC that I would start at in 1988 brought me back to the road, a little bit, back and forth to Richmond. We had been hosted at Healthnet (Blue Cross and Blue Shield) on Staples Mill Road, and also had used “The Computing Company” (TCC), a Richmond contractor that in those days did Virginia MMIS. And there was one weekend in May 1989 that I recall particularly well, when the little company had been sold to what is now a significant healthcare consulting company, the Lewin Group (then it was Lewin/ICF). My job for the weekend was to go down to Richmond, back up all the data onto open reel tapes (with simple IEBGENERs or SYNCSORTs), catalogue them, bring them back to DC in boxes, and leave them secured in a building in DC on Sunday afternoon. The physical safety of the little company’s work was in my hands. And it was in the trunk of a Ford Escort (mine) Sunday afternoon as I drove back to DC and watched the miles count down from 100 to 0 along I-95. It’s interesting that something so critical (the future of a whole company) could depend on doing something so simple (and mundane) as multistep JCL; but you had to be careful to code the parameters for multi-file tapes correctly. I saved printouts of the JCL and put them under my pillow in the motel room that night; it was that critical.
That whole experience leads in to what I do today with my own “stuff”: keeping track of backups (diskettes, hidden sites, etc.), security for computers, everything. It also reminds me of a thoroughly modern problem for businesses everywhere, with the proliferation of laptops, removable hard drives, and telecommuting: the physical security of private consumer data, which I transported legally in my own vehicle in 1989. Yes, I was paid for my extra time that weekend.
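That weekend's copy-catalogue-verify routine has a straightforward modern analog: copy each file, then verify the copy byte-for-byte before trusting it. This is a minimal sketch, not what I actually run; the paths and file names are hypothetical.

```python
# A rough modern analog of a copy-and-verify backup run:
# copy each file, then confirm the copy matches via a checksum,
# the way you would verify a tape before driving it up I-95.
# All paths here are hypothetical.
import hashlib
import os
import shutil
import tempfile

def sha256(path):
    """Checksum a file in chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_and_verify(src, dst_dir):
    """Copy src into dst_dir and fail loudly if the copy doesn't match."""
    dst = os.path.join(dst_dir, os.path.basename(src))
    shutil.copy2(src, dst)  # copy2 preserves timestamps, like cataloguing
    if sha256(src) != sha256(dst):
        raise IOError(f"backup of {src} failed verification")
    return dst

# demo with a throwaway temporary file
with tempfile.TemporaryDirectory() as d:
    src = os.path.join(d, "master.dat")
    with open(src, "wb") as f:
        f.write(b"critical company data")
    out = os.path.join(d, "backup")
    os.makedirs(out)
    copied = backup_and_verify(src, out)
    print(os.path.basename(copied))  # master.dat
```

The principle is the same one the multi-file tape parameters enforced in 1989: a backup you have not verified is not a backup.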