October 31, 2009

It's a Shame - Original Software's Software Testing Hall of Shame

My good friend Jim Hazen has been involved in Software Quality Assurance for a long time.

So noticing mistakes in websites comes naturally to him. And when he recently visited Original Software's corporate blog, he found a good one.

If you go to Original Software's Blog, you'll see that they periodically post about public software failures that they call the "Software Testing Hall of Shame".

Here's one:  http://www.origsoft.com/blog/archives/software-testing-hall-of-shame-survey-reveals-hidden-price-of-software-failures

These are good articles, and worth reading.

However, if you click the Original Software logo in the upper left corner, you see that they have improperly linked this logo, so you are presented with their 404 page:

When I visited, it said:

Sorry!, No Page Was Found

The page you where looking could not be found

So not only do they have a bad link, but bad grammar, bad spelling, and dicey punctuation as well.

Such a shame!

October 22, 2009

Bugs In The Wild - SQE Training

While reading an email from SQE, I clicked a link and ended up here:

You can see it for yourself at http://www.sqetraining.com/Events/register.asp?fx=show&sfx=etrdetail&tfx=showseminars&event=eSTF&regtype=etr

When I looked closely, I happened to notice two typos.  I have highlighted them in red.

"Course Guarantee:Course attendees who do not pass the ISTQB Software Tester Certification-Foundation Level exam within 60 days of completing the course, will be provided with and additional 45 days of free access to the course."

"Who's Behind the Training? SQE Training is affilated with Software Quality Engineering, the publisher of StickyMinds.com and Better Software magazine."
I guess it wouldn't be so bad if this weren't the registration page for a Software Tester Certification course!

October 13, 2009

Perhaps They Should Have Tested More - Apple

I’m a PC.

I’m a Snow Leopard.

Hi there, Snowy! How are things?

I’m a Snow Leopard.

Oops. Looks like someone is stuck in “out-of-box” condition.   

I’m a Snow Leopard.

Perhaps they should have tested more?

I’m a Snow Leopard.

A critical bug in Apple's Mac OS X 10.6 (Snow Leopard) reportedly can wipe out users' account information when they open and close "guest" accounts.

  • Snow Leopard has apparently suffered this problem from the beginning
  • Critical bug can wipe out users' account information
  • Data deleted upon logout
  • Complete loss of user data
  • Home directory is replaced with a new, empty copy
  • Was reported at least one month ago
  • Not corrected in the Mac OS X 10.6.1 update
  • All the standard folders have reverted to an "out-of-box" condition

“Users start their Macs up as normal only to find they’ve logged in as ‘Guests’ on their machine – with all the files and data held on their Mac in their own user account seemingly deleted.”

Perhaps they should have tested more.


October 11, 2009

Book: Exploratory Software Testing

A few years back, I read How to Break Software by James Whittaker.  I liked it.  It wasn't wonderful, but it had a good batch of practical, useful tips.  Then I read How to Break Software Security and How to Break Web Software.  I liked them as well, but not as much.  Still, I figured I'd read James Whittaker's newest book Exploratory Software Testing.  Sadly, the downward progression of his writing continues.  This book is by far the worst of the bunch.

Chapter 1 - The Case for Software Quality is nothing more than "software is terrific, but it has bugs".   That's it, nothing more here.

Chapter 2 - The Case for Manual Testing talks a bit about testing, and tries to define exploratory testing.  Whittaker's definition has apparently caused some controversy among some well-known practitioners of exploratory testing, so here is his perhaps unique definition:
When the scripts are removed entirely (or as we shall see in later chapters, their rigidness relaxed), the process is called exploratory testing.
Whittaker then divides exploratory testing into two sections.  Exploratory testing in the small is that which guides the tester to make small, distinct decisions while testing.  Exploratory testing in the large guides the tester in how an application is explored more than how a specific feature is tested.

Chapter 3 - Exploratory Testing in the Small was, to me, the only useful chapter in the whole book.  Here Whittaker offers practical advice with examples for thinking about constructing test data, software state, and test environment.

Chapter 4 - Exploratory Testing in the Large is where Whittaker dives into what appears to be the point of the whole book - his Tourist Metaphor.  Apparently this is a big hit at Microsoft, but I found it pointless.  Think about every type of testing you have ever performed.  Now try to torture it into a phrase that ends with the word Tour.  There you go - that's the chapter. 

Just to give you a flavor, here's a list of all these Tours, and their variations:
  • The Guidebook Tour
    • Blogger's Tour
    • Pundit's Tour
    • Competitor's Tour
  • The Money Tour
    • Skeptical Customer Tour
  • The Landmark Tour
  • The Intellectual Tour
    • Arrogant American Tour
  • The FedEx Tour
  • The After-Hours Tour
    • Morning-Commute Tour
  • The Garbage Collector's Tour
  • The Bad-Neighborhood Tour
  • The Museum Tour
  • The Prior Version Tour
  • The Supporting Actor Tour
  • The Back Alley Tour
    • Mixed-Destination Tour
  • The All-Nighter Tour
  • The Collector's Tour
  • The Lonely Businessman's Tour
  • The Supermodel Tour
  • The TOGOF Tour
  • The Scottish Pub Tour
  • The Rained-Out Tour
  • The Couch Potato Tour
  • The Saboteur Tour
  • The Antisocial Tour
    • Opposite Tour
    • Crime Spree Tour
    • Wrong Turn Tour
  • The Obsessive-Compulsive Tour
Perhaps the idea of calling UI Testing a Supermodel Tour appeals to you, and will make for a richer, more productive set of tests.  I don't get it.  I just don't see any value here.  Doesn't testing have enough variation in language and definitions already, without adding this silliness?

Chapter 5 - Hybrid Exploratory Testing Techniques tells us that it's acceptable to combine scenario testing with exploratory testing.  Then it spends time rehashing each of the tours from Chapter 4 and tries to suggest a side trip for each. 

Chapter 6 - Exploratory Testing in Practice presents essays written by several Microsoft testers describing how they each used one or more of the tours in a testing situation.  It appears as if Whittaker instructed his charges to write a "What I did this summer"-style  essay, in the form of "How I used Tours to do my testing". 

Chapter 7 - Touring and Testing's Primary Pain Points tries to tell us (in a few paragraphs) how to avoid five pain points - Aimlessness, Repetitiveness, Transiency, Monotony, and Memorylessness.  There's little real instruction here.  For example, we are told that in order to avoid repetitiveness, we must know what testing has already occurred, and understand when to inject variation.  Uhm, ok.

Chapter 8 - The Future of Software Testing has nothing at all to do with the other chapters, or exploratory testing.  It's basically Whittaker's gee-whiz vision of what might be possible (some day) in the future.  Perhaps.  Whittaker has given this talk in several webinars - it's simply rehashed here.

Since these chapters take up only 136 pages, and obviously aren't enough to fill out a real book, three unrelated appendices are bolted on.  A few pages about Testing as a career, and a bunch of pages lifted directly from Whittaker's blogs fill out the book to over 200 pages.

If you really want to learn about Exploratory Testing, this is probably not the place.  Exploratory Software Testing is fluff - stretched and tortured to barely reach book length.  There's not much in the way of learning here.

And if Microsoft testers are really instructed to "Tell me what kind of testing you did today, and make sure it ends with the word Tour", then I feel very sorry for them.

I also posted my review of this book on Amazon, where it drew some comments.

October 8, 2009

Checking a List of Sites Using Xenu Link Sleuth

A new member at SQAForums asked:

"I test an online booking website which links to over 60000 client websites.

I am looking for a tool that will allow me to check their URL's to ensure that they are still valid.

I am trying to find a tool that will allow me to create a script that will reference the URL from a spreadsheet and send a query to that URL and get the HTTP Header response (to see if it returns a 404).

I have been looking for tools and the ones that I have found seem to check the link then expands out to check links on that page. I don't need (or want) that to happen. I just need a tool that checks the header response for the URL provided then moves on to the next one in the list."

I use Xenu Link Sleuth for this sort of thing. It's free, but not open source. It's also easy to use, and very fast.

Xenu can be set to check a Maximum Level of 0 - indicating that it should not spider the site, but just check the top-level URLs.

Here's how to do that:
  1. Create a text file containing all the URLs you wish to check, with each on a separate line
  2. In Xenu, select Option, Preferences... and set Maximum Level = 0 in the Options dialog
  3. Set any other Options you choose
  4. In Xenu, select File, Check URL List (Test)...
  5. In the Open URL List dialog, open the text file containing the URLs that you created in Step 1
  6. Your test runs
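
If you'd rather script the check yourself, here's a minimal sketch in Python that does the same sort of thing - one HEAD request per URL, recording just the status code, with no spidering.  This is only an illustration, not part of Xenu; the file name urls.txt and the function name are my own.

```python
# Send a HEAD request to each URL in a list and record the HTTP status,
# without following any links found on the pages.
import urllib.request
import urllib.error

def check_urls(urls, timeout=10):
    """Return {url: status_code}; None means the host was unreachable."""
    results = {}
    for url in urls:
        req = urllib.request.Request(url, method="HEAD")
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                results[url] = resp.status          # e.g. 200
        except urllib.error.HTTPError as e:
            results[url] = e.code                   # e.g. 404 for a dead page
        except (urllib.error.URLError, OSError):
            results[url] = None                     # DNS failure, refused connection, etc.
    return results

# Example usage, reading one URL per line as in Step 1 above:
#   urls = [line.strip() for line in open("urls.txt") if line.strip()]
#   for url, status in check_urls(urls).items():
#       print(status, url)
```

For 60,000 sites you'd want to add some concurrency, but the idea is the same: check only the header response for each listed URL, then move on.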

You can find Xenu Link Sleuth at:

October 1, 2009

Book: Even Faster Web Sites: Performance Best Practices for Web Developers

After really liking Steve Souders' High Performance Web Sites: Essential Knowledge for Front-End Engineers, I really wanted to like his latest offering Even Faster Web Sites: Performance Best Practices for Web Developers.  I liked it, but not nearly as much.

While High Performance Web Sites is centered around Souders' "14 Rules" for better web performance, Even Faster Web Sites isn't rule-oriented.  Instead, it is organized into the three areas of JavaScript performance, network performance, and browser performance.  And six of the fourteen chapters were written by contributing authors, rather than by Souders himself.

All this adds up to a somewhat uneven, and less widely applicable, set of ideas.

Here's the list of chapters:
  • Understanding Ajax Performance - written by Douglas Crockford
  • Creating Responsive Web Applications - written by Ben Galbraith and Dion Almaer
  • Splitting the Initial Payload
  • Loading Scripts Without Blocking
  • Coupling Asynchronous Scripts
  • Positioning Inline Scripts
  • Writing Efficient JavaScript - written by Nicholas C. Zakas
  • Scaling with Comet - written by Dylan Schiemann
  • Going Beyond Gzipping - written by Tony Gentilcore
  • Optimizing Images - written by Stoyan Stefanov and Nicole Sullivan
  • Sharding Dominant Domains
  • Flushing the Document Early
  • Using Iframes Sparingly
  • Simplifying CSS Selectors
A good book, but still a minor disappointment.