Saturday, February 07, 2015
Hawaii Health Connector problems pale in comparison to this NYC blunder
by Larry Geller
And we thought Hawaii had a problem with its Health Connector. Not only were we not alone (several states had problems not unlike ours), but check this out:
In a front-page report for the New York Daily News, Democracy Now! co-host Juan González exposes the troubles plaguing New York City’s overhaul of its 911 communications system. The NYC Department of Investigation found the administration of former Mayor Michael Bloomberg mismanaged the upgrade with multiple layers of unaccountable private consultants and vendors, putting the project nearly $1 billion over budget and 10 years behind schedule.
[Democracy Now, Juan González: Overhaul of NYC’s 911 System Woefully Mismanaged & Nearly $1 Billion Over Budget, 2/6/2015] (video)
No, you don’t have to read the rest (unless you really want to).
I’m wondering whether complex programs have passed some threshold that brings their, um, do-ability, for want of a better word, into question.
The first computer program I ever wrote was named Minnie, Minimum Cost of Group Relamping. It took me maybe half an hour, in Basic, on a GE Timesharing system. For all I know, its descendants may still be in use today. It automated a calculation telling the user at what interval it would save money to change all the fluorescent lamps in a building at once, instead of incurring the labor cost of calling someone every time the lamp above someone’s desk started flickering annoyingly. During my summer job, we did the calculation by hand. It was time-consuming by hand, and took seconds on the computer. In those days, that was remarkable. Being able to run the numbers for a client meant that a salesman could knock off a very big sale (a whole building or warehouse, say) effortlessly. The client could save money. A classic win-win. (If only I had been in sales, though, instead of a kid programmer…).
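The original was a few lines of Basic I no longer have, but the comparison it automated can be sketched in modern Python. Every figure and parameter name below is invented for illustration; the real program would have used whatever lamp, labor, and burn-hour numbers the client supplied.

```python
# Hypothetical sketch of a group-relamping comparison; all numbers
# below are invented, not from the original Minnie program.

def annual_cost_spot(n_lamps, lamp_life_hours, hours_per_year,
                     lamp_cost, labor_per_visit):
    """Cost of replacing each lamp individually as it fails."""
    failures_per_year = n_lamps * hours_per_year / lamp_life_hours
    return failures_per_year * (lamp_cost + labor_per_visit)

def annual_cost_group(n_lamps, interval_years, lamp_cost,
                      labor_per_lamp_bulk):
    """Cost of replacing every lamp at once on a fixed interval.
    Bulk labor per lamp is far cheaper than a per-failure service call."""
    return n_lamps * (lamp_cost + labor_per_lamp_bulk) / interval_years

# Invented example: 1,000 lamps rated 10,000 hours, burning 4,000 h/yr.
spot = annual_cost_spot(1000, 10_000, 4_000,
                        lamp_cost=2.0, labor_per_visit=15.0)
group = annual_cost_group(1000, interval_years=2,
                          lamp_cost=2.0, labor_per_lamp_bulk=1.0)
print(f"spot: ${spot:,.0f}/yr  group: ${group:,.0f}/yr")
# → spot: $6,800/yr  group: $1,500/yr
```

With made-up numbers like these, the per-visit labor charge dominates, which is exactly why the calculation closed sales: the savings were obvious once you ran it.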
Also using nothing more sophisticated than Basic, at a customer location (lugging my “portable” Model 33 teletype a few blocks uptown), I assisted a salesperson in pitching to a financial institution. The customer wondered if the computer could identify bad (usually counterfeit) banknotes. The list of bad numbers was in a book, and a clerk would grab a ruler and turn pages and then skim the ruler down the columns to see if a number was there. Since it usually wasn’t, the process was a huge waste of time. On the spot, I wrote a simple program to input a number from the keyboard and look it up in a table of most any size to see if there was a match. Trivial to do. Piece of cake. The customer signed the contract without even reading it. This also led to some programming gigs during the school year.
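That on-the-spot program was just a keyboard loop over a table lookup; in modern Python the heart of it is a one-line membership test. The serial numbers here are invented stand-ins for the book the clerk used to page through.

```python
# Sketch of the bad-banknote lookup; the serial numbers are invented.
# A set gives constant-time membership tests, replacing the clerk's
# ruler-down-the-columns scan.

bad_numbers = {"B7704120A", "C1138900F", "K0042317D"}

def is_bad(serial: str) -> bool:
    """Normalize the input, then check it against the bad-number table."""
    return serial.strip().upper() in bad_numbers

print(is_bad("b7704120a"))  # → True
print(is_bad("A0000001X"))  # → False
```

Since most notes checked were fine, almost every lookup returns False instantly, which is where the time savings over the paper process came from.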
At night, the computers ran payroll and other “batch” jobs. Most programs were written in Cobol. They all worked well, development costs were reasonable, and there were few, if any, surprises. Bugs were quickly eradicated, and upgrades happened quickly. The data was on tape, and the tapes were backed up and the backups kept off-site. Life was so simple then.
(sigh) I guess things have changed a bit. Could Cobol have handled the Hawaii Health Connector program? Probably not. Ok, certainly not. I’m not trying to say it could. But neither did all the sophisticated and complicated systems they tried to use. Are we missing something here?
Large computer projects can be done correctly and on budget. Any large project can be broken into smaller pieces and handled as multiple small projects. Modern programming environments and practices are built around this presumption.
It takes several things to succeed: a well-written set of requirements, so that everyone knows what is to be done; good management and good coordination to keep the various programmers and programming teams working together; and good testing! Lose any one of those and trouble comes quickly.
Of course you are correct. So I wonder all the more about these failures. Perhaps the NYC project will be audited.