December 19, 2006

Perhaps They Should Have Tested More - Glasgow's e-Formulary IT System

Because of a software bug, patients received Viagra instead of Zyban - and nobody complained! Go figure!

Computer glitch prescribes Viagra to stop smoking

By Tom Sanders  19 December 2006 11:19AM  General News

Smokers trying to quit report unusual side-effect.

A software bug in Glasgow's e-Formulary IT system has been blamed for replacing prescriptions for the Zyban anti-smoking medication with the erectile dysfunction medicine Viagra.

Doctors who tried to select the smoking pill instead ended up printing prescriptions for sildenafil, the generic name for Viagra. The National Health Service Greater Glasgow has sent out a warning to family doctors and surgeries in the area.

No patients have complained about receiving the wrong medication, a spokesperson for the health authority told The Times.

The glitch has been traced back to an update of the General Practice Administration System for Scotland. The problem went unnoticed for about six weeks and is expected to take an estimated four weeks to repair.

The health risks of taking Viagra are limited, as the medication has no serious side-effects.

December 16, 2006

Perhaps They Should Have Tested More - Sequoia Voting Systems

An interesting combination of "How Well Did e-Voting Work This Time" and "Perhaps They Should Have Tested More", sent to me by my friend Daphne.

Report blames Denver election woes on flawed software
Todd Weiss

December 13, 2006 (Computerworld) Poor software design, serious IT management inefficiencies and an untested deployment of a critical application were all major factors in last month's Election Day problems in Denver, according to a scathing report from an IT consultant. The problems led to hours-long delays for voters looking to cast ballots and raised questions about the overall efficacy of e-voting.

The 32-page report, released Monday, concluded that the main reason for problems was the electronic poll book (ePollBook) software used by the independent Denver Election Commission (DEC) to oversee voting. The e-poll book software -- an $85,000 custom application created by Oakland, Calif.-based Sequoia Voting Systems Inc. -- included the names, addresses and other information for all registered voters in Denver.

Sequoia was already a voting services vendor to the city and county, and the application was designed to allow poll workers across the Denver area to check off voters as they came in to vote at newly created voting centers. Denver has moved from the old precinct-style polling places to a new "voting center" model where voters can go to any polling place in the area to cast ballots, regardless of where they live. The software was supposed to make it easy for officials at any voting center to check online and make sure a voter had not already voted somewhere else in Denver.

Instead, it led to massive problems on Election Day due to "decidedly subprofessional architecture and construction," according to the report from consultants Fred Hessler and Matt Smith at Fujitsu Consulting in Greenwood Village, Colo. Fujitsu was hired by Denver shortly after the election to find out what went wrong and help to fix the problems.

"The ePollBook is a poorly designed and fundamentally flawed application that demonstrates little familiarity with basic tenets of Web development," the report stated. "Due to unnecessary and progressive consumption of system resources, the application's performance will gradually degrade in a limited-use environment and will be immediately and noticeably hampered with a high number of concurrent users."

In other words, the more heavily it was used, the slower it worked.

"Moreover, it appears that this application was never stress-tested by the DEC or Sequoia," other than using it in the spring primary as a test election, the report said. "It is at best naive to deploy enterprise software in an untested state. It is remarkably poor practice to deliberately choose a critical production event (the primary election) to serve as a test cycle."

The Sequoia application was chosen over a tested ePollBook application already in use by Larimer County, Colo., that has been offered to other Colorado counties for free. The consultants recommend that the DEC either get the Sequoia application repaired or take a new look at the Larimer software to see whether it could be used effectively in Denver. The Larimer application uses a server-resident Microsoft Access front-end accessed via Citrix and an Oracle database on a dedicated server, as well as five application servers for access by election officials.

The voting center delays -- with waits in some places of up to three hours -- forced an estimated 20,000 voters to abandon their efforts to vote on Election Day, according to the report.

Other problems with the software include Web sessions that would not expire unless a user clicked a specific "exit" button to close the application, tying up system resources, according to the report. The problem, gleaned from user activity logs generated during the Nov. 7 election, was that 90% of user sessions that day were not ended using the special button but were closed by users who simply shut the browser. That did not free up resources, causing the system slowdowns.

"In media reports following the election, Sequoia defended this flaw by stating that the DEC had not requested that a session-timeout feature be implemented," the consultants wrote. "This is a weak and puzzling defense. In any case, session management is a fundamental responsibility that developers of Web applications are expected to fulfill. Describing session management as a special feature that must be requested by the client is not a reasonable position to adopt."
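The consultants' point is that idle-session expiry is table stakes for a web application, not an optional feature. As a purely illustrative sketch (my own Python, not Sequoia's stack or any reconstruction of the ePollBook), here is the kind of minimal server-side session store with an idle timeout they describe as a baseline expectation:

```python
import time

# Minimal in-memory session store with an idle timeout. A background reaper
# (or a check on each request) drops sessions abandoned without a clean exit,
# so closed-browser users stop pinning server resources.
SESSION_TIMEOUT_SECONDS = 30 * 60  # e.g. expire sessions idle for 30 minutes

class SessionStore:
    def __init__(self, timeout=SESSION_TIMEOUT_SECONDS, clock=time.monotonic):
        self._sessions = {}   # session_id -> (data, last_seen)
        self._timeout = timeout
        self._clock = clock   # injectable clock makes the timeout testable

    def touch(self, session_id, data=None):
        """Create or refresh a session on each request from that user."""
        old = self._sessions.get(session_id, ({}, 0.0))[0]
        self._sessions[session_id] = (data if data is not None else old,
                                      self._clock())

    def reap(self):
        """Drop sessions idle longer than the timeout, freeing their resources."""
        now = self._clock()
        expired = [sid for sid, (_, seen) in self._sessions.items()
                   if now - seen > self._timeout]
        for sid in expired:
            del self._sessions[sid]
        return len(expired)

    def active_count(self):
        return len(self._sessions)
```

With expiry like this running server-side, the 90% of poll workers who simply closed the browser would have stopped consuming resources after half an hour of inactivity, regardless of whether anyone clicked the "exit" button.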

Also troubling, the consultants said, is that the application and database currently share a server instead of relying on a dedicated database server -- something that would have improved performance, security and redundancy.

A spokeswoman for Sequoia, Michelle M. Shafer, declined to comment directly on the consultant's report in an e-mail response. "While we may disagree with opinions expressed by the author of this report, our focus is on helping Denver solve their problems," she wrote.

In addition to the software problems, the report stated, IT management within the DEC needs to change so that similar situations don't occur again.

The three key flaws within the DEC are "generally substandard information technology operations and management," "dysfunctional communications between the technology function and other leadership," and "a general and pervasive insufficiency of oversight, due diligence, and quality assurance," according to the report.

These issues also led to problems with absentee ballots that couldn't be easily scanned by poll workers and other difficulties with equipment, poll workers and other systems, said the report. "The less-than-rigorous conduct of the ePollBook development project and the ultimate failure of [it] on Election Day, along with ... the absentee ballot scanning problem, should be viewed in a broader context of substandard technology management within the DEC," the report said. "Given the increasing criticality of technology in conducting elections and the sensitivity of personal data in the DEC's possession, this casual approach to technology cannot be permitted to continue."

Alton Dillard, a spokesman for the DEC, said the commission "agrees with 99% of the report" and will take actions to resolve the problems. "The ePollBook was the chokepoint, but there are some other things that need to be addressed," he said.

The DEC meets Dec. 19 to decide how to handle next year's spring primary and off-year fall elections. Three options are under consideration, Dillard said, including the use of mailed ballots for all voters, a return to precinct voting or continuing to use voting centers while fixing or replacing the ePollBook software. Officials want to get everything fixed before the 2008 presidential election, he said.

"Right now, there's no uniformity among the [election] commissioners on which form to accept," Dillard said.

Chris Henderson, the chief operating officer for the city of Denver and a spokesman for Mayor John Hickenlooper, said the consultant's report shows that "clearly the ... technology component of the election commission is pretty broken right now. We are dismayed on a lot of levels about the troubled nature of the implementation of the [ePollBook] software. The challenge is the election commission's business to sort out those questions."

Henderson said he hopes the DEC looks seriously at the consultants' other recommendations, including a call for the DEC to take advantage of the IT staff and resources used by the city and county. "I think, clearly, there's an opportunity for them to benefit from some of the smart people we have working for the city of Denver," he said.

On a related note, John Gaydeski, the executive director of the DEC, resigned from his post last week in response to the problems stemming from the November election.

December 13, 2006

Zarro Boogs Found

Those of you who use Bugzilla have no doubt encountered the phrase:

Zarro Boogs found.

Here's the "official" explanation of that phrase from the Bugzilla Glossary:

Zarro Boogs Found

This is just a goofy way of saying that there were no bugs found matching your query. When asked to explain this message, Terry had the following to say:

I've been asked to explain this ... way back when, when Netscape released version 4.0 of its browser, we had a release party. Naturally, there had been a big push to try and fix every known bug before the release. Naturally, that hadn't actually happened. (This is not unique to Netscape or to 4.0; the same thing has happened with every software project I've ever seen.) Anyway, at the release party, T-shirts were handed out that said something like "Netscape 4.0: Zarro Boogs". Just like the software, the T-shirt had no known bugs. Uh-huh.

So, when you query for a list of bugs, and it gets no results, you can think of this as a friendly reminder. Of *course* there are bugs matching your query, they just aren't in the bugsystem yet...

--Terry Weissman

December 11, 2006

Perhaps They Should Have Tested More - NASA

Apparently the satellite control software was off by 45 degrees:
"Anybody that has ever taken algebra has gotten a problem wrong because you slipped a minus sign somewhere"
So NASA isn't able to get its algebra right? That can't be a good sign.

December 11, 2006

Software glitch spoils inaugural launch from Va. spaceport

Associated Press Writer

ATLANTIC, Va. - The inaugural rocket launch from the mid-Atlantic region's commercial spaceport will be postponed until at least Thursday - and possibly until next month - while scientists try to fix a software glitch that forced Monday's scheduled takeoff to be scrubbed.

Teams still were troubleshooting a problem with the flight software for one of the two satellites to be carried by the Minotaur I rocket, so the earliest the launch could be rescheduled would be Thursday, said Keith Koehler, spokesman for NASA's Wallops Flight Facility, where the spaceport's launch pad is located.

"They're looking at the possibility of trying to make the corrections on the launch pad," Koehler said Monday afternoon. If that attempt fails, the satellite will have to be removed from the rocket to be worked on, and that would push the launch date into January, he said.

The original launch window ran through Dec. 20, with the NASA Wallops range closed during the last week of December for the holidays, Koehler said.

Earlier Monday, officials had said the launch would be postponed until at least Wednesday, and possibly for two to three weeks, because Air Force teams discovered an anomaly with the flight software for the TacSat-2 satellite while doing tests Sunday night.

The problem occurred in software that controls the pointing of the satellite toward the sun so solar panels can charge batteries, said Neal Peck, TacSat-2 program manager. The software would have tilted the panels at a 45-degree angle instead of having them face directly into the sun, he said.

"So we would not be receiving sufficient power to the spacecraft to power all our systems and to conduct all our experiments," he said during a news conference at NASA Wallops two hours before the rocket was to have taken off at 7 a.m.

Asked what caused the problem, Peck said, "It's basically an error in the software."

"Anybody that has ever taken algebra has gotten a problem wrong because you slipped a minus sign somewhere," Peck said. "My guess is it was something along those lines."
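Peck's "minus sign" remark was only a guess at the cause, and the toy calculation below is my own invention, not the TacSat-2 flight code: it just illustrates how one flipped sign in a sun-pointing computation throws the attitude angle badly off, and how an off-angle panel loses power.

```python
import math

def sun_angle(sx, sy):
    """Angle of the sun vector in a toy 2-D spacecraft frame, in degrees."""
    return math.degrees(math.atan2(sy, sx))

def sun_angle_buggy(sx, sy):
    """The same computation with one 'slipped minus sign' on the y component."""
    return math.degrees(math.atan2(-sy, sx))

# With the sun at 45 degrees in this frame, the single flipped sign points
# the panels 90 degrees away from where they should be:
correct = sun_angle(1.0, 1.0)       # 45.0 degrees
buggy = sun_angle_buggy(1.0, 1.0)   # -45.0 degrees
pointing_error = abs(correct - buggy)

# A flat panel tilted 45 degrees off the sun collects only cos(45 deg),
# about 71% of full power -- consistent with "not receiving sufficient
# power ... to power all our systems".
power_fraction = math.cos(math.radians(45.0))
```

The exact error geometry in the real software is not described in the article; the point of the sketch is only that a sign slip in an angle computation produces a large, systematic pointing error rather than random noise.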

The TacSat-2 satellite will test the military's ability to transmit images of enemy targets to battlefield commanders in minutes - a process that now can take hours or days. The Air Force envisions a system that would allow commanders to send questions directly to a satellite overhead and receive answers before the satellite passes back over the horizon.

Also aboard the rocket is NASA's shoebox-size GeneSat-1 satellite, which carries a harmless strain of E. coli bacteria as part of an experiment to study the long-term effects of space on living organisms. The results could be useful for NASA's mission to Mars.

The Mid-Atlantic Regional Spaceport, or MARS, is one of only six federally licensed launch centers in the country. The Air Force will pay the spaceport $621,000 for the launch, spaceport director Billie Reed said Sunday.

Reed did not immediately return a telephone call seeking comment Monday.

The Virginia Commercial Space Flight Authority, a state agency created in 1995, built the launch pad in 1998 on land leased from NASA on Wallops Island on Virginia's Eastern Shore peninsula. Maryland later joined the commercial venture.

Orbital Sciences Corp. of Dulles built the rocket with two stages made from decommissioned Minuteman intercontinental ballistic missiles and two stages from Pegasus rockets.

Updated: December 16, 2006

The 69-foot Minotaur I rocket soared from the launch pad at 7 a.m. ET, after teams spent the week resolving a glitch in software for one of the satellites that had scrubbed a liftoff on Monday.

The delay added "a couple hundred thousand dollars" to the $60 million price of the mission, Air Force Col. Scott McCraw, the mission director, said Friday. Included in the total is the cost of the rocket and the two satellites and $621,000 the Air Force will pay the spaceport.

December 6, 2006

Software Testing is NOT "Breaking Things"

For some odd reason, I really don't like it when software testers say "I enjoy breaking things".

When you test and find a bug, you haven't broken anything - it was already broken!  If anything, the developer who wrote the code broke it.

And now that you have found a breakage, your job has just begun.  You need to dig in much further:
  • Under what conditions does this break occur?  Under what conditions does it not occur?
  • What steps are required to reproduce this break?  And can you express those steps in simple terms so that developers and other testers can see it for themselves?
  • Can you gather related symptoms, logs, images, etc - to help make fixing this break simpler?
  • How long might it take to test a fix for this break?
  • Is this break indicative of a more general problem?  How will we know?
  • Does the presence of this break, or a fix for this break, mean we should re-execute some of our tests?  If so, which ones?
  • What risks does this break expose?
  • When did this break get introduced?
  • Was it possible to find this break sooner?  If so, why didn't we already find it?
  • Should we modify our testing processes to find breaks like this more effectively?
If you enjoy breaking things, perhaps demolition is a good profession for you.

But if you enjoy planning, conducting, and analyzing the results from controlled experiments designed to find existing (or potential) breakages, then software testing might be right for you.
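One way to make the investigation checklist above concrete is to capture it as a structured bug report. This is a minimal sketch; the class and field names are my own, not from any particular bug tracker:

```python
from dataclasses import dataclass, field

@dataclass
class BugInvestigation:
    """Structured record of a found break; fields mirror the checklist above."""
    summary: str
    repro_steps: list = field(default_factory=list)   # steps anyone can follow
    conditions_present: str = ""   # under what conditions does it occur?
    conditions_absent: str = ""    # under what conditions does it not?
    attachments: list = field(default_factory=list)   # logs, images, data
    introduced_in: str = ""        # build or change where the break appeared
    broader_risk: str = ""         # symptom of a more general problem?
    retest_scope: list = field(default_factory=list)  # tests to re-execute

    def is_reproducible(self):
        """A report without reproduction steps is a sighting, not yet a bug."""
        return bool(self.repro_steps)
```

A report like this is only "done" when a developer or another tester can follow it unaided; the `is_reproducible` check encodes the bare minimum of that standard.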

December 3, 2006

One Answer to the Question About the Ratio of Testers to Developers

Often I hear questions like "What is the best ratio of Testers to Developers?" or "What is the industry standard ratio of Testers to Developers?"

As I have mentioned before, those questions really have no answer. The appropriate ratio depends totally on context - the industry, the company, the software, the projects, the budget, the role of the testers, etc, etc.

But, for those who really crave a ratio, and don't care about context, the current issue (December 2006) of Better Software Magazine provides an answer.

Hundreds of their readers answered a survey about their employment situation.

In the results, they present several charts - one of which is the "Ratio of Testers to Developers".

While precise numbers are not given, their chart appears to show the following:
  • about 5% report a 1:1 ratio
  • about 45% report from a 1:2 to a 1:4 ratio
  • just over 30% report from a 1:5 to a 1:10 ratio
  • about 10% report a ratio of 1:10 or more
  • just a few percent report 2:1, 3:1, 4:1 or 5:1 ratios
You should consider signing up for a free subscription. Good stuff, free!