Automated Software Testing Life Cycle: Part II

Posted by Lisa Corkren in Automation | January 17, 2011

The software testing life cycle has many phases. Recently, I wrote an article on the first phase, which covered scoping the project and the Proof of Concept (POC). That phase involves gathering all the information necessary to evaluate the time constraints and the processes that the testing team or client wants implemented in a project.

The next phase is construction and verification. Specification documents are a necessary part of any software testing effort, and in this phase the test plans and test scripts are created. The development team writes detailed specifications and starts coding the application, so it is time to begin static testing of the specifications and code via reviews. The following sections describe common review techniques used during construction and verification.
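
Before getting to the reviews, here is a minimal sketch of the kind of automated test script this phase produces. I am using Python and pytest purely for illustration; the framework choice and the validate_login function are my own assumptions, not something the process prescribes.

    # Hypothetical example: a small data-driven test script.
    # validate_login is a stand-in for real application code under test.
    import pytest

    def validate_login(username, password):
        # Placeholder rule: non-empty username, password of 8+ characters.
        return bool(username) and len(password) >= 8

    @pytest.mark.parametrize("username,password,expected", [
        ("alice", "s3cretpass", True),   # happy path
        ("alice", "short", False),       # password too short
        ("", "s3cretpass", False),       # missing username
    ])
    def test_validate_login(username, password, expected):
        assert validate_login(username, password) == expected

Driving the same steps with many data combinations, as this script does, is exactly the kind of repetition that makes a test a strong automation candidate — a point the checklist later in this article returns to.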

Inspection: An inspection involves a team, led by a trained moderator (not the author), that formally reviews documents and work products during the various phases of the product development life cycle. The benefit of this process is that the peers' combined experience and knowledge produce a higher level of quality. The bugs discovered during the review are documented and communicated to the next level so they can be addressed. (Graham, 2007)

Walkthroughs: A walkthrough is a step-by-step presentation that is less formal than an inspection. During the walkthrough meeting, the author introduces the material to the participants to familiarize them with the application under test. Although walkthroughs can help find potential bugs, they are mostly used to gather information, establish a common understanding of the content, and further the communication process. (Graham, 2007)

Buddy Checks: A buddy check is the simplest type of review activity, a form of proofreading used to find bugs in a work product during verification. A person with a similar background goes through the documents prepared by the author in order to find any mistakes or bugs the author could not find on their own.

Verification: The verification activities involved are requirement specification verification, functional design verification, internal/system design verification, and code verification. Each activity makes sure that the product is developed the right way and that every requirement, specification, design, and piece of code is verified. (Graham, 2007)

Construction: For an automated process, once static testing has determined that the project is ready to move forward, we need to evaluate which test cases can be automated.

The following questions are a guideline for determining which tests should be automated (a scoring sketch follows the list):

  • Does this test have the potential for modularity and portability?
  • Is this test part of regression or build testing?
  • Does this test cover the most critical feature paths?
  • Does this test cover high risk areas?
  • Is this test expensive to perform manually?
  • Is this test part of a performance test?
  • Are there timing-critical components or dependencies that must be automated?
  • Does the test cover the most complex area?
  • Does the test require many data combinations using the same steps?
  • Are the expected results constant?
  • Is the test outcome analysis overly time-consuming?
  • Does the test need to be verified on multiple software platforms?
  • Does the test automation ROI look promising and meet organizational criteria?
  • What are the clients' testing tools?
  • Which application interface will be used for testing (web, UI, GUI)?
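
One way to act on this checklist is to score each candidate test against it. The sketch below is my own illustration, with made-up criteria weights and a made-up cutoff; it is not a formal methodology, just a way to keep the evaluation consistent across a suite.

    # Illustrative sketch: score a candidate test against the checklist.
    # Criteria names, weights, and the cutoff are assumptions, not a standard.
    AUTOMATION_CRITERIA = {
        "regression_or_build": 3,     # reruns every cycle, so automation pays off fast
        "critical_feature_path": 3,
        "high_risk_area": 2,
        "expensive_manually": 2,
        "many_data_combinations": 2,  # same steps, different data
        "stable_expected_results": 2,
        "multi_platform": 1,
        "modular_and_portable": 1,
    }

    def automation_score(answers):
        """Sum the weights of every criterion answered 'yes'."""
        return sum(weight for name, weight in AUTOMATION_CRITERIA.items()
                   if answers.get(name, False))

    # Example: a data-driven login regression test with stable results.
    candidate = {
        "regression_or_build": True,
        "many_data_combinations": True,
        "stable_expected_results": True,
    }
    score = automation_score(candidate)
    print(f"score={score} -> {'automate' if score >= 5 else 'keep manual'}")

A test that scores low here is not necessarily a bad test; it is simply one that may be cheaper to keep manual.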

In conclusion, not every test should be automated, and evaluating the list above will help you choose the tests that bring the greatest ROI. Construction and verification will help your team understand the goals of the software, build confidence in the application, and increase the morale of the testing team. Choosing to automate tests that meet these guidelines will save resources as well as time.
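
Since ROI comes up both in the checklist and here, a quick back-of-the-envelope calculation shows how such an estimate might work. The formula ROI = (savings − cost) / cost is a common one; every number below is a made-up assumption for illustration.

    # Rough automation-ROI estimate for a single test.
    # All figures are illustrative assumptions.
    manual_minutes_per_run = 30
    automated_minutes_per_run = 2
    runs_per_year = 100          # e.g. part of a nightly or weekly regression suite
    hourly_rate = 60.0           # assumed fully loaded tester cost, in dollars

    build_hours = 8              # one-time cost to script the test
    maintenance_hours_per_year = 4

    savings = ((manual_minutes_per_run - automated_minutes_per_run) / 60
               * runs_per_year * hourly_rate)                        # $2,800
    cost = (build_hours + maintenance_hours_per_year) * hourly_rate  # $720

    roi = (savings - cost) / cost
    print(f"savings=${savings:.0f}, cost=${cost:.0f}, ROI={roi:.0%}")  # ROI=289%

If the estimated ROI is negative or marginal at realistic run counts, the test probably belongs in the manual bucket for now.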


Lisa Corkren is a certified technical lead for DeRisk IT, Inc. She focuses on automated testing and project management and has worked on numerous platforms in both manual and automated testing. Her specialties include automated testing strategies and SmartBear’s TestComplete.

DeRisk IT Inc. specializes exclusively in remote, onshore, offsite services related to application software and system infrastructure testing. Its main focus areas are functional, performance, and system integration testing, using both manual and appropriate automation techniques. Since its founding in 1998, the primary role of DeRisk IT Inc. has been to help corporate organizations forecast and plan IT projects with risk avoidance in mind and to implement appropriate testing solutions to achieve this. DeRisk IT Inc. specializes in risk analysis at the corporate level; at the project level, this results in the proactive use of testing tools and methodologies to ensure projects are completed on time and to the right specification. DeRisk IT Inc. has established a position at the forefront of application testing, providing testing solutions such as performance, compatibility, security, usability, and monitoring, and offers a full portfolio of services for all testing needs.