Automation Evaluation

An automation assessment model for evaluating an automation solution against the objectives / goals set for it:

Example Goals

  • Maintainable – to reduce the amount of test maintenance effort;
  • Relevant – to provide clear traceability of the business value of automation;
  • Reusable – to provide modularity of test cases and function libraries;
  • Manageable – to provide effective test design, execution and traceability;
  • Accessible – to enable collaboration on concurrent design and development;
  • Robust – to provide object/event/error handling and recovery;
  • Portable – to be independent of the SUT and completely scalable;
  • Reliable – to provide fault tolerance over a number of scalable test agents;
  • Diagnosable – to provide actionable defects for accelerated defect investigation;
  • Measurable – to provide a testing dashboard along with customisable reporting.

Subset of goals taken from ‘The Big Picture of Test Automation: Test Trustworthiness’ – Alan Page, Microsoft (2012) @alanpage

Example Score Card

  • Platform Support – support for multiple operating systems, tablets & mobile;
  • Technology Support – “multi-compiler” vs. “compiler-specific” test tools;
  • Browser Support – Internet Explorer, Firefox, Google Chrome or any other browser based on web browser controls;
  • Data Source Support – obtain data from text and XML files, Excel worksheets and databases like SQL Server, Oracle and MySQL;
  • Multi-Language Support – localized solutions supporting Unicode;
  • Test Type Support – functional, non-functional and unit (e.g. NUnit & MSTest);
  • Test Approach Support – e.g. Hybrid-Keyword/Data-Driven testing;
  • Results & Reporting Integration – including images, files, databases, XML documents;
  • Test Asset / Object Management – the ability to map an object by more than just its caption or identifier;
  • Class Identification – gap analysis of object classes (generic / custom) and associated method capabilities, based on complexity, usage, risk, feasibility and re-usability;
  • Test Scenario Maintenance – manual effort (XPath/regular expressions), self-maintaining (descriptive programming/fuzzy logic) or scriptless (DSLs);
  • Continuous Build & Integration / Delivery Integration – integration with the build & delivery solution;
  • Future Proofing – external encapsulation of test assets & associated metadata (XAML/XPDL), expandability (API/DLL/.NET), HTTP/WCF/COM/WSD and OCR/IR;
  • License, Support & Maintenance Costs – pricing policy along with any hidden costs.
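One way to turn the score card into a concrete comparison is to weight each criterion and rate each candidate tool against it, reducing the evaluation to a single weighted score per tool. The sketch below shows the idea in Python; the criteria subset, weights, tools and ratings are illustrative assumptions only, not results from a real evaluation.

```python
# Weighted scorecard sketch: the criteria subset, weights, tools and
# ratings below are illustrative assumptions, not a real evaluation.
CRITERIA_WEIGHTS = {
    "Platform Support": 3,
    "Technology Support": 2,
    "Browser Support": 3,
    "Data Source Support": 2,
    "Test Approach Support": 4,
    "License, Support & Maintenance Costs": 5,
}

# Ratings per candidate tool on a 0-5 scale (hypothetical values).
RATINGS = {
    "Tool A": {"Platform Support": 4, "Technology Support": 3,
               "Browser Support": 5, "Data Source Support": 4,
               "Test Approach Support": 3,
               "License, Support & Maintenance Costs": 2},
    "Tool B": {"Platform Support": 3, "Technology Support": 4,
               "Browser Support": 4, "Data Source Support": 3,
               "Test Approach Support": 5,
               "License, Support & Maintenance Costs": 4},
}

def weighted_score(ratings):
    """Return the weighted average rating, still on the 0-5 scale."""
    total_weight = sum(CRITERIA_WEIGHTS.values())
    return sum(w * ratings[c] for c, w in CRITERIA_WEIGHTS.items()) / total_weight

for tool, ratings in RATINGS.items():
    print(f"{tool}: weighted score {weighted_score(ratings):.2f} / 5")
```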

Example Test Approach Support 

This can also be referred to as the test automation framework / testware generation approach that is going to be used:

Example Test Approach cross reference chart below:

[Figure: Test Approach cross-reference chart]

Taken from the “Hybrid Keyword Data Driven Framework” presented at the ANZTB in 2010, updated with “Test Automation As A Service” presented at STARWest in 2012.
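As a rough illustration of what Hybrid-Keyword/Data-Driven support looks like in practice, the sketch below separates keywords (reusable actions) from the data rows that drive them. The keyword names, data and the simple dispatcher are hypothetical; a real framework would bind the keywords to the SUT and read the rows from an external source such as Excel, XML or a database.

```python
# Hybrid keyword/data-driven sketch: keywords are reusable actions,
# data rows decide which keywords run and with which arguments.
# All keywords and data below are hypothetical placeholders.

def open_app(name):
    print(f"opening {name}")

def login(user, password):
    print(f"logging in as {user}")

def verify_title(expected):
    print(f"verifying title == {expected}")

KEYWORDS = {
    "OpenApp": open_app,
    "Login": login,
    "VerifyTitle": verify_title,
}

# Each row: (keyword, arguments). In a real framework these rows would be
# read from an external source such as an Excel sheet, XML file or database.
TEST_DATA = [
    ("OpenApp", ["MyApp"]),
    ("Login", ["alice", "secret"]),
    ("VerifyTitle", ["Dashboard"]),
]

def run(rows):
    for keyword, args in rows:
        KEYWORDS[keyword](*args)  # dispatch the keyword with its data

run(TEST_DATA)
```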

Automation Ready

Automation Maturity Model Index (AMMi)

  • Solution assessment looking directly at the company's automation challenges and possible opportunities for improvement.
  • Test Asset index to evaluate how the test artefacts are stored and to encapsulate the process that manages them.

The example here was a company using no automation solutions, with test assets stored in a legacy file template. In this case the AMMi was around the 1.1 mark (based on the moment in time when the assessment was completed, back in Q1 2012):
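The AMMi calculation itself is not spelled out here, but one simple way a figure like 1.1 can arise is by averaging maturity ratings (say 0 to 5) across a handful of assessment areas. The areas, levels and the averaging below are assumptions for illustration only, not the actual AMMi definition.

```python
# Illustrative AMMi-style calculation: average the maturity level (0-5)
# assessed for each area. The areas and levels are assumptions, shown
# only to demonstrate how an index of roughly 1.1 could arise.
maturity_levels = {
    "Tooling": 1,        # no automation tooling in place
    "Test assets": 1,    # assets held in a legacy file template
    "Process": 2,        # a repeatable manual process exists
    "Skills": 1,
    "Reporting": 0.5,
}

ammi = sum(maturity_levels.values()) / len(maturity_levels)
print(f"AMMi = {ammi:.1f}")  # -> AMMi = 1.1
```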

“The best aspects of proven test approaches have evolved over the past decade, and this echoes some of the changes towards more lean and agile business methodologies, which are in a constant state of evolution – just as the underpinning technology evolves over time.”

With this in mind, the best possible automation solution today may not be the right solution for every client. Therefore, sometimes half the battle is simply starting the client on the automation journey by taking them one step closer to becoming ‘Automation Ready’.

In this example the recommendation was to follow the age-old approach of Man’mation: encapsulate the manual execution phase using Hyper-Test to capture a journal, in a generic XML/XAML/XPDL format, that can be used to drive ‘Automation by Example’ in the future, extracting all the necessary test artefacts so that all 10,000 scripts can eventually be run for regression purposes.
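A minimal sketch of the kind of journal such an approach could capture is shown below: each manual step is recorded as a generic XML entry that a future ‘Automation by Example’ engine could replay. The element and attribute names are assumptions for illustration, not the actual Hyper-Test or XPDL schema.

```python
# Sketch of capturing a manual execution journal as generic XML.
# Element and attribute names are illustrative, not a real schema.
import xml.etree.ElementTree as ET

def record_step(journal, action, target, value=None):
    """Append one manual step to the journal as an XML element."""
    step = ET.SubElement(journal, "step", action=action, target=target)
    if value is not None:
        step.set("value", value)

journal = ET.Element("journal", script="Login regression")
record_step(journal, "open", "https://example.test/login")
record_step(journal, "type", "username", "alice")
record_step(journal, "type", "password", "secret")
record_step(journal, "click", "sign-in button")
record_step(journal, "verify", "page title", "Dashboard")

# The resulting XML is stored with the other test artefacts so a future
# 'Automation by Example' engine can replay the journal as a script.
print(ET.tostring(journal, encoding="unicode"))
```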

Check out the following presentation for further information:

Test Automation As A Service – Return on Investment