
September 30, 2008

The Antonym of Testing


"... one usually encounters a definition such as, 'Testing is the process of confirming that a program is correct. It is the demonstration that errors are not present.' The main trouble with this definition is that it is totally wrong; in fact, it almost defines the antonym of testing."

- Glenford Myers,

Software Reliability: Principles & Practices, 1976

People keep telling me that testing is a validation activity -- that the purpose of testing is to validate that the software meets all the specifications, has no errors, meets performance SLAs, meets expectations of anonymous users, or some other lofty goal.

I read about testing processes designed to validate software. I use testing tools built to support validation. I listen to service companies pitch testing services to validate software. I read about testing metrics built on the assertion that software systems can be proved correct. I attend testing presentations explaining the presenters' best practices for validation.

The trouble is that we cannot prove software correct. We cannot prove the absence of bugs. We cannot test every possible state and input. We cannot evaluate every possible output. We cannot fully understand the desires of stakeholders. We cannot prove that customers will be happy. We cannot prove that a software product will solve the problems it was built to solve. If all this were possible, I suspect insurance companies would find a way to make a profit selling software quality insurance.

"If you think you can fully test a program without testing its response to every possible input, fine. Give us a list of your test cases. We can write a program that will pass all your tests but still fail spectacularly on an input you missed. If we can do this deliberately, our contention is that we or other programmers can do it accidentally."

- Cem Kaner, Jack Falk, and Hung Quoc Nguyen,
Testing Computer Software, Second Edition, 1999

Now, thirty-two years since Glenford Myers called testing to prove correctness the opposite of testing, we're surrounded by testing practices and tools based on proving correctness. The myth of proving correctness is alive and well.

Activities designed to try to prove correctness are the antonym of testing.

So if testing is not validation, what is testing? Testing is investigation, and the communication of useful information about quality to decision makers.

"Testing is the process by which we explore and understand the status of the benefits and the risk associated with release of a software system."

- James Bach,
James Bach on Risk-Based Testing, STQE Magazine, Nov 1999


"Testing is done to find information. Critical decisions about the project or the product are made on the basis of that information."

- Cem Kaner, James Bach, Bret Pettichord,
Lessons Learned In Software Testing: A Context-Driven Approach, 2002


"A software tester’s job is to test software, find bugs, and report them so that they can be fixed. An effective software tester focuses on the software product itself and gathers empirical information regarding what it does and doesn’t do. This is a big job all by itself. The challenge is to provide accurate, comprehensive, and timely information, so managers can make informed decisions."

- Bret Pettichord,
Don't Become the Quality Police, StickyMinds.com, 2002


Once we admit that we cannot prove the software correct, we can refocus our efforts on finding useful quality-related information. Instead of pretending to assure quality or validate correctness, we can gather and communicate useful information. Investigate the software. Find information about threats to the quality of the systems under investigation. Communicate that information in terms that matter to stakeholders. Help managers make informed decisions.

May 20, 2008

Is There A Problem Here?




msn video

To use this product, you need to install free software

This product requires Microsoft Internet Explorer 6 with Microsoft Media Players 10 and Macromedia Flash 6 or higher versions, or Mozilla Firefox 1.5 with Macromedia Flash 8, or Safari 2.0.4 with Macromedia Flash 8. To download these free software applications, click the links below and follow the on-screen instructions.

Step 1: download firefox 1.5
download firefox 1.5

Step 2: Download Macromedia Flash Player
Macromedia Flash player is free to download.
If still having problems, uninstall Flash and then re-install Flash.


Once the installations are complete, reload this page.

August 18, 2007

Failure Usability

One of my pet peeves about software is bad error messages. In my view, a bad error message is one that does not tell the user how the error impacts them and what they need to do in response to the error. Too many messages fail to communicate this information in terms that the software's user is going to understand. Too many of these messages are written for developers, not users.

There is a place for error logging in terms that help developers and testers troubleshoot and fix problems. This information is often best written to log files, not displayed in the user interface.
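As a minimal sketch of this separation (the function name, file paths, and messages here are invented for illustration, not taken from any product discussed in this post), a program can write the technical details to a log file for troubleshooters while showing the user only the impact and the recovery step:

```python
import logging
import os
import tempfile

# Technical details go to a log file for developers, testers, and support,
# not to the user interface. (Log path is illustrative.)
logfile = os.path.join(tempfile.gettempdir(), "app.log")
logging.basicConfig(
    filename=logfile,
    level=logging.DEBUG,
    format="%(asctime)s %(levelname)s %(message)s",
)

def save_document(path):
    """Hypothetical save operation; returns a user-facing message on failure."""
    try:
        with open(path, "w") as f:
            f.write("...")
    except OSError as exc:
        # Log what troubleshooters need: the operation, the path, the OS error.
        logging.error("save failed: path=%r errno=%s", path, exc.errno,
                      exc_info=True)
        # Tell the user what *they* need: the impact and how to recover,
        # in plain language.
        return ("Your document could not be saved to this location. "
                "Your work is still open. Try saving to a different folder.")
    return None

message = save_document("/no/such/dir/report.txt")
print(message)
```

The design choice is the point: the stack trace and errno belong in the log, while the sentence the user sees answers "what happened to my work, and what do I do now?"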

Pradeep Soundararajan and I recently discussed some of our experiences with error messages. You can listen to excerpts from this conversation using the link below.



After that conversation, Pradeep walked me through an exercise that demonstrated a case in a popular office application in which I, the user, was not sufficiently informed that an error was occurring. It was obvious that the actions I was trying to perform were not working, but the software gave me no clear indication of why.

Good testers recognize the need to include "negative testing" in their search for significant bugs. Testing for errors is a common part of functional testing. Let's go beyond functional testing of errors. Let's test errors for usability.

Here is an error testing mnemonic I created after our conversation.

  • Functional
  • Appropriate (or Apropos*)
  • Impact
  • Log
  • UI
  • Recovery
  • Emotions

Functional
Does the error detection and reporting function as required? Are errors not detected that should be detected? Are errors reported? Do error dialogs function as expected? Do the buttons work?


Appropriate (or Apropos*)
Are errors detected and reported in an accurate and timely manner for the intended audience? Are errors reported as soon as an error condition is met? Are warning messages displayed while there are enough resources to remedy the problem? Is a user allowed to waste time and effort only to be told that their work cannot be applied? Is the text of a message accurate? Does the text convey the situation to the intended audience? Is the error described in terms that will be understood by the intended audience?

Impact
Is the impact of the error sufficiently communicated to the user? Does the message contain too little information for the user to understand what occurred and how it impacts what they were attempting to do? Does the message contain extra information that distracts from communicating the impact?

Log
Does technical information need to be logged for support, system administrators, developers, or testers? Will this log information be available if the user waits to contact support? Are log messages standardized to allow for automated information mining? Are logs detailed enough to facilitate troubleshooting? Are errors logged that add no value? Is there too much logging? Does excessive logging negatively impact performance and disk space? Does excessive logging complicate error investigation?
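One way to make log entries standardized and mineable (a minimal sketch; the field names and error code here are invented for illustration) is to emit each entry in a structured, machine-parseable form such as a JSON line rather than free text:

```python
import json
import time

def log_entry(code, component, detail):
    # A standardized record: fixed fields make automated mining and
    # aggregation (counts by error code, by component) straightforward.
    return json.dumps({
        "ts": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "level": "ERROR",
        "code": code,
        "component": component,
        "detail": detail,
    })

line = log_entry("E1042", "exporter", "disk full while writing report.pdf")
print(line)

# Mining is then a matter of parsing each line back into fields:
record = json.loads(line)
print(record["code"], record["component"])
```

Free-text log messages force every analysis script to start with fragile pattern matching; fixed fields let the same logs answer new questions later.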

UI
Are users given some indication that what they attempted to do failed? Are user interface messages worded for the intended audience? Are user interface error messages consistent with the look and feel of the software? Are error messages consistent with other activity that causes the same error elsewhere in the application? Are errors communicated in an efficient manner? Does a user need to click away excessive dialogs? Is this error best communicated as an error dialog? Is this error best communicated as text added to the window? Is this error best communicated audibly?

Recovery
Does the error message tell the user how to recover from the error condition? Does the software facilitate recovery? If needed, is contact information provided? Is the user prompted through the recovery or left to figure it out on their own?

Emotions
What emotions are likely to be raised by the error message? Does the information in the message add to a user's frustration or help quiet it? If a user is being told that they need to pay more to use a feature, does the message encourage them to upgrade or does it encourage them to find a competitor's product? Does the error message cause more confusion?


Here's an error message Pradeep gave me to help test the mnemonic. What faults can you find in the error message using the mnemonic?





The next time you see an error message -- or don't see one that you should -- don't stop at functionality. Check the rest of the FAILURE. (And look here for a PowerPoint show demonstrating the mnemonic.)


PS: While attempting to save this post, Blogger gave me the following error. This error fails many of the tests above. The second save button distracted me from the light grey error message. Trying again did not fix the problem. It was not clear if the error meant that my text had been saved or not. I finally had to copy the HTML of the post and paste it into a new Blogger session. Bad error reporting is easy to find. :)


* UPDATE: 18 Aug 2007 @ 6:32 PM: Michael Bolton suggested that "Appropriate" would be more appropriate than "Apropos". After some consideration, I agree. Thanks Michael.