Saturday, January 16, 2010

Evaluating Test Tools for Automation – Criteria to be considered

Before starting any automation effort, it is common practice to evaluate the various commercial and free tools available in the market and choose the one best suited to the application under test. This article describes the criteria to consider while evaluating automation tools and a methodology for evaluating them. It does not aim to provide a comparison of the tools themselves; rather, it lays out the criteria against which such a comparison can be made.

The following are the most common criteria to consider while evaluating automation tools for any application. In addition to these, there may be application-specific criteria to consider based on the application under test.

Evaluation Criteria:

      1. Technology Support - Support for the various controls and technologies used in the application, such as iFrames, AJAX controls, PDF forms, tree views, ColdFusion controls, etc.
      2. Ease of Script Development/Enhancement
      3. Reporting Results - The tool under evaluation should produce a result log that is easy to analyze and that pinpoints defects.
      4. Test Independence - The failure of one test script should not have any impact on the rest of the test suite.
      5. Maintenance of Scripts - As automated test scripts carry a high maintenance overhead, the tool should make script maintenance easy.
      6. Multi-browser Support - The tool under evaluation should support different flavors of the Windows OS and multiple browsers (at least IE6, IE7 and IE8).
      7. Data-Driven Capability - The tool should provide a means to keep all test data in an external data store and to read from and write to that store.
      8. Ease of Object Store Maintenance - The object store (the repository of all objects captured by the tool) should be easy to maintain.
      9. Ease of Continuous Integration for Nightly Regression Tests - The tool under evaluation should provide an easy means to integrate the tests with the build environment so that nightly regression tests can run in a continuous integration environment.
      10. Limitations - Limitations of the tool with respect to the application under test.
      11. Advantages - Advantages of the tool with respect to the application under test.
      12. Cost of Licensing - The tool should not be expensive and should offer flexible licensing options.
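Criterion 7, data-driven capability, generally means the tool can repeat one script over rows of an external data store. The sketch below is a hypothetical illustration of the pattern, not the API of any tool listed above; the CSV columns and the login check are invented for the example. It also illustrates criterion 4, since one failing row does not stop the rest.

```python
import csv

def run_data_driven(data_file, test_step):
    """Run test_step once per row of an external CSV data store.

    A failing row is recorded but does not abort the remaining rows,
    so one bad data set cannot take down the whole suite."""
    results = []
    with open(data_file, newline="") as fh:
        for row in csv.DictReader(fh):
            try:
                test_step(row)
                results.append((row, "PASS"))
            except AssertionError as exc:
                results.append((row, f"FAIL: {exc}"))
    return results

def check_login(row):
    """Illustrative test step: validate a row's expected outcome."""
    assert row["expected"] in ("success", "failure"), "unknown expectation"
```

Commercial tools wrap this loop in their own data-table UI, but the evaluation question is the same: how easily can the script be pointed at an external store and iterated over it?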

Evaluation Procedure:

  1. Select the tools you want to consider for evaluation. The tools commonly considered for automation include QTP, TestPartner, TestComplete, Visual Studio Team Edition, Selenium, Microsoft Lightweight Test Automation Framework, ArtOfTest WebAii, etc.
  2. For each tool under evaluation, identify the pros and cons with respect to each of the criteria listed above.
  3. Give each tool a score for each criterion. The scoring scale is:
    • 1 - Below Average
    • 2 - Average
    • 3 - Good
    • 4 - Very Good
    • 5 - Outstanding
  4. Prepare a score card of all the tools for each criterion considered. A sample score card is shown below. Please do not treat the data below as actual comparison data; it is provided only as an example.

    Evaluation Criterion             | QuickTest Pro | TestPartner | TestComplete | VSTS
    Technology Support               | 4             | 3           | 3            | 2
    Ease of Script Development       | 4             | 3.5         | 3            | 3
    Reporting                        | 4             | 3.5         | 3            | 3
    Test Independence                | 4             | 3           | 4            | 4
    Script Maintenance               | 4             | 3.5         | 3            | 4
    Cross-Browser Support            | 4             | 3           | 2            | 2
    Data-Driven Capability           | 4             | 4           | 4            | 4
    Ease of Object Store Maintenance | 4             | 3           | 3            | NA
    License Cost                     | 2             | 3           | 4            | 3
    Final Score                      | 3.78          | 3.28        | 3.22         | 3.13

  5. Provide a rank for each tool based on the scores above. A sample ranking is shown below. This ranking does not represent an absolute comparison of the tools; rather, it reflects their suitability for the specific application against which they were evaluated.

    Tool                                                 | Final Score | Tool Rank
    QuickTest Pro                                        | 3.78        | 1
    TestPartner                                          | 3.28        | 2
    TestComplete                                         | 3.22        | 3
    Visual Studio 2008 Team Edition for Software Testers | 3.13        | 4

  6. Recommend the best-suited tool based on the rank.
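The arithmetic behind steps 3 to 5 is simply an average of each tool's criterion scores (skipping NA criteria, as the VSTS column does) followed by a descending sort. A minimal sketch of that computation, assuming the score card is kept as a dictionary per tool with None marking NA:

```python
def final_scores(scorecard):
    """Average each tool's criterion scores; None marks an NA
    criterion and is excluded from that tool's average."""
    scores = {}
    for tool, marks in scorecard.items():
        valid = [m for m in marks.values() if m is not None]
        scores[tool] = sum(valid) / len(valid)
    return scores

def rank_tools(scores):
    """Rank tools 1..n by descending final score."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return [(rank, tool) for rank, (tool, _score) in enumerate(ordered, 1)]
```

Feeding the sample score card above into these functions reproduces the Final Score row, after rounding to two decimals, and the ranking table.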

Although you can recommend a specific tool based on technical analysis against specific criteria, your recommendation will not always win the race. Most of the time the final choice is a business decision driven by cost and budget, so be open to working with any tool and finding workarounds for its issues.

--LN

Sunday, January 10, 2010

A Checklist for Performance Testing – Requirement Questionnaire

A checklist for eliciting performance testing requirements.

Performance testing of an application involves the various phases outlined below. This article attempts to provide a questionnaire that can help test leads, managers and testers elicit the performance testing requirements. This information is important to gather before starting a performance test. Note that the questionnaire provided here is a generic list of questions common to most applications; it needs to be tailored to the nature of the application under test. I usually send this questionnaire to the stakeholders of the application to answer, and I have used it successfully for conducting performance tests.


(Figure: phases of Performance Testing)


Performance Testing Requirement Questionnaire

  1. Please provide the URLs and credentials of the application for testing.
  2. Please provide the test environment (Hardware and Software) configuration.
  3. Where is the test environment set up? - Inside the firewall in an isolated LAN environment, or outside the firewall?
  4. What technologies have been used to develop the application?
  5. What are the interfaces of the application? E.g., payment gateways, web services, etc.
  6. Briefly describe the business/domain of the application.
  7. Is the application already in production? - Is this performance testing being conducted pre-production or post-production?
  8. Are the web server logs for the application available? Applicable only if the application is already in production.
  9. What are the critical workflows of the application to be considered as candidates for performance testing?
  10. What is the expected workload (number of simultaneous virtual users) to be tested?
  11. What is the average session duration of a user? - Average time a user would be logged into the application.
  12. How many hours a day will the application be available to and accessed by the users?
  13. Do you have any specific performance objectives (SLAs) for this test? E.g., 1000 invoices to be processed in a day.
  14. Is the test data required for performance testing available in adequate quantity and in the required format?
  15. Do the test team members have the necessary privileges on the test server machines?
  16. Are you aware of any performance problems in the application experienced by the users or observed by other stakeholders?
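Answers to questions 10 through 13 should be cross-checked against each other. One rough sanity check, assuming sessions arrive evenly across the available hours, is Little's Law: concurrent users ≈ arrival rate × average session duration. The sketch below uses purely illustrative numbers; it is a back-of-the-envelope estimate, not a substitute for the workload figures the stakeholders provide.

```python
def concurrent_users(sessions_per_day, hours_available, avg_session_minutes):
    """Estimate simultaneous virtual users via Little's Law
    (concurrency = arrival rate x time in system), assuming
    sessions arrive evenly across the available hours."""
    arrivals_per_minute = sessions_per_day / (hours_available * 60)
    return arrivals_per_minute * avg_session_minutes

# Illustrative: 1000 sessions a day spread over 10 business hours,
# with a 15-minute average session:
# 1000 / 600 min = 1.67 arrivals/min, x 15 min = 25 concurrent users.
```

If this estimate disagrees badly with the workload stated in the answer to question 10, that is worth raising with the stakeholders before scripting begins.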

--LN