Posted by Ben Simo
Automation is of little value if it does not report useful information that can be quickly reviewed by testers.
Reported results should contain enough information to answer the following questions:
- What happened?
- What is the state of the application?
- How did the application get in that state?
- What automation code was executed?
- What automation data/parameters were used?
Some of the failures reported by automated tests will be errors in the system under test; others will be errors in the automation model or code. It is important that results give the reader enough information to diagnose both kinds of failure.
I have found logging the following information to be useful (a sketch of how these records might be structured follows the lists):
Test (test configuration information)
- Title
- Start Time
- Script File(s)
- Model Files
- Test Set
- Severity
- Environment
- Object Map
- Action Table(s)
- Oracle Table(s)
- Computer Name
- Operating System
- Tester
Actions (controlling the application)
- Source (where is the action defined?)
- Title
- Start Time
- Action Details
- Duration
- State Transition
- Automation Code
- Result Details
- Snapshot (screen capture, saved files, etc.)
- Status (Pass, Fail, Inconclusive)
Oracles (validating the results)
- Source (where is the oracle defined?)
- Title
- State
- Automation Code
- Error Code / Description
- Validation Details
- Snapshot (screen capture, saved files, etc.)
- Status (Pass, Fail, Inconclusive)
Messages (report useful information not directly connected to an action or oracle)
- Message
- Link
- Snapshot
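To make the structure concrete, here is a minimal sketch, assuming Python, of how these four record types might be represented. The class and field names are hypothetical; they simply mirror the lists above and should be adapted to your tool and environment.

```python
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum
from typing import List, Optional


class Status(Enum):
    PASS = "Pass"
    FAIL = "Fail"
    INCONCLUSIVE = "Inconclusive"


@dataclass
class ActionRecord:
    """One application-controlling step, mirroring the Actions list above."""
    source: str                      # where the action is defined
    title: str
    start_time: datetime
    action_details: str
    duration_seconds: float
    state_transition: str            # e.g. "LoginPage -> HomePage"
    automation_code: str             # code (or a reference to it) that ran
    result_details: str
    snapshot: Optional[str] = None   # path to screen capture, saved file, etc.
    status: Status = Status.INCONCLUSIVE


@dataclass
class OracleRecord:
    """One result validation, mirroring the Oracles list above."""
    source: str                      # where the oracle is defined
    title: str
    state: str
    automation_code: str
    error_code: Optional[str] = None
    error_description: Optional[str] = None
    validation_details: str = ""
    snapshot: Optional[str] = None
    status: Status = Status.INCONCLUSIVE


@dataclass
class MessageRecord:
    """Useful information not tied to a specific action or oracle."""
    message: str
    link: Optional[str] = None
    snapshot: Optional[str] = None


@dataclass
class TestRecord:
    """Test configuration information, mirroring the Test list above."""
    title: str
    start_time: datetime
    script_files: List[str]
    model_files: List[str]
    test_set: str
    severity: str
    environment: str
    object_map: str
    action_tables: List[str]
    oracle_tables: List[str]
    computer_name: str
    operating_system: str
    tester: str
    actions: List[ActionRecord] = field(default_factory=list)
    oracles: List[OracleRecord] = field(default_factory=list)
    messages: List[MessageRecord] = field(default_factory=list)
```

Whatever representation you choose, the point is that every action, oracle, and message carries enough context to answer the questions at the top of this post.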
Once you have decided what data to report, it is important to present that data in a manner conducive to efficient analysis. Results need to be both comprehensive and summarized (or linked) in ways that help human testers and toolsmiths quickly answer the questions listed above. A 10-hour automated test run may be of little value if it takes another 10 hours to interpret the results.
Standardizing reporting and presentation is the first step toward improving results analysis. Do not rely on your tool's built-in reporting: an expensive test automation tool should not be required to view results, especially the incomplete results reported by many tools. Instead, create a common reporting library and use it in all your tests, so that users of the reported results do not need to learn a new format for every project or test. Some suggested output formats are:
- HTML: Human readers like color-coded, well-formed results presented in HTML. A little JavaScript can be added to customize the experience.
- XML: Extensible Markup Language (XML) files can be processed by machines and displayed to human readers when style sheets are applied (see the sample after this list).
- Tab-Delimited / Excel: Simple tab-delimited, CSV, or Excel tables are useful reporting formats that are easily processed by both people and machines.
- Database: Results written directly to a database can be easily compared to results from previous test executions.
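For example, a machine-readable XML result file might look like the following. The element names and the results.xsl style sheet are hypothetical, but the xml-stylesheet processing instruction is the standard way to attach a style sheet so the same file can be rendered for human readers.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="results.xsl"?>
<!-- Hypothetical result file: element and attribute names are illustrative. -->
<testrun title="Smoke Test" start="2007-01-15T09:00:00"
         computer="TESTBOX1" tester="tester1">
  <action title="Login" status="Pass" duration="2.3">
    <transition from="LoginPage" to="HomePage"/>
    <snapshot href="snapshots/login.png"/>
  </action>
  <oracle title="Account balance" status="Fail">
    <error code="E042" description="Balance mismatch"/>
    <details>Expected 100.00, got 99.00</details>
    <snapshot href="snapshots/balance.png"/>
  </oracle>
  <message>Server build 1.2.3 deployed before run</message>
</testrun>
```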
Determine your needs and select the output formats that best meet them. If you standardize your reporting through a single small set of reporting functions, you can easily adapt reporting as your needs change; a sketch of such a library follows.
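As an illustration, here is a minimal sketch, assuming Python, of a common reporting library that funnels all results through one interface and emits two of the formats above (tab-delimited and color-coded HTML). The Reporter class, its method names, and the HTML layout are hypothetical; the idea is that tests log through one place and output formats can be added or swapped without touching the tests.

```python
import csv
import html


class Reporter:
    """A common reporting front end shared by all tests."""

    def __init__(self):
        self.rows = []  # (status, title, details) tuples

    def log(self, status, title, details):
        self.rows.append((status, title, details))

    def write_tab_delimited(self, path):
        # Tab-delimited output is easily processed by both people and machines.
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f, delimiter="\t")
            writer.writerow(["Status", "Title", "Details"])
            writer.writerows(self.rows)

    def write_html(self, path):
        # Color-code rows so failures stand out during review.
        colors = {"Pass": "#ccffcc", "Fail": "#ffcccc", "Inconclusive": "#ffffcc"}
        lines = ["<html><body><table border='1'>",
                 "<tr><th>Status</th><th>Title</th><th>Details</th></tr>"]
        for status, title, details in self.rows:
            bg = colors.get(status, "#ffffff")
            lines.append(
                f"<tr style='background:{bg}'><td>{html.escape(status)}</td>"
                f"<td>{html.escape(title)}</td><td>{html.escape(details)}</td></tr>"
            )
        lines.append("</table></body></html>")
        with open(path, "w", encoding="utf-8") as f:
            f.write("\n".join(lines))


# Usage: every test logs through the same reporter, so every reader
# sees the same format regardless of project or tool.
reporter = Reporter()
reporter.log("Pass", "Login action", "Transitioned LoginPage -> HomePage")
reporter.log("Fail", "Balance oracle", "Expected 100.00, got 99.00")
reporter.write_tab_delimited("results.txt")
reporter.write_html("results.html")
```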