An automation assessment model evaluates test automation against the objectives and goals set for it:
Example Goals
- Maintainable – to reduce the amount of test maintenance effort;
- Relevant – to provide clear traceability of the business value of automation;
- Reusable – to provide modularity of test cases and function libraries;
- Manageable – to provide effective test design, execution and traceability;
- Accessible – to enable collaboration on concurrent design and development;
- Robust – to provide object/event/error handling and recovery;
- Portable – to be independent of the SUT and completely scalable;
- Reliable – to provide fault tolerance over a number of scalable test agents;
- Diagnosable – to provide actionable defects for accelerated defect investigation;
- Measurable – to provide a testing dashboard along with customisable reporting.
Subset of goals taken from ‘The Big Picture of Test Automation: Test Trustworthiness’ – Alan Page, Microsoft (2012) @alanpage
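One way to make these goals actionable is to rate each of them during a review and flag the weakest areas. The sketch below assumes a 0-5 maturity rating per goal agreed by the team; the goal keys, example ratings and threshold are illustrative assumptions, not part of the original model.

```python
# Minimal sketch: rate each automation goal 0-5 and flag improvement areas.
# Ratings and threshold are hypothetical values for illustration only.

GOALS = ["maintainable", "relevant", "reusable", "manageable", "accessible",
         "robust", "portable", "reliable", "diagnosable", "measurable"]

def gaps(ratings: dict[str, int], threshold: int = 3) -> list[str]:
    """Return the goals whose rating falls below the agreed threshold."""
    return [g for g in GOALS if ratings.get(g, 0) < threshold]

# Example ratings as they might come out of a review workshop (hypothetical).
example_ratings = {"maintainable": 4, "relevant": 3, "reusable": 4,
                   "manageable": 3, "accessible": 2, "robust": 3,
                   "portable": 2, "reliable": 4, "diagnosable": 5,
                   "measurable": 3}

print(f"Overall maturity: {sum(example_ratings.values()) / len(GOALS):.1f} / 5")
print("Improvement focus:", gaps(example_ratings))
```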
Example Score Card
- Platform Support – multiple operating systems, tablet and mobile devices;
- Technology Support – “multi-compiler” vs. “compiler-specific” test tools;
- Browser Support – Internet Explorer, Firefox, Google Chrome or any other browser based on web browser controls;
- Data Source Support – obtain data from text and XML files, Excel worksheets and databases like SQL Server, Oracle and MySQL;
- Multi-Language Support – localized solutions supporting Unicode;
- Test Type Support – functional, non-functional and unit (e.g. NUnit & MSTest);
- Test Approach Support – e.g. Hybrid-Keyword/Data-Driven testing;
- Results & Reporting Integration – including images, files, databases, XML documents;
- Test Asset / Object Management – the ability to map an object by more than just its caption or identifier;
- Class Identification – gap analysis of object classes (generic / custom) and associated method capabilities based on complexity, usage, risk, feasibility and reusability;
- Test Scenario Maintenance – manual effort (XPath/regular expressions), self-maintaining (descriptive programming/fuzzy logic) or scriptless (DSLs);
- Continuous Integration & Delivery – integration with the build and delivery solution;
- Future Proofing – external encapsulation of test assets and associated metadata (XAML/XPDL), expandability (API/DLL/.NET), HTTP/WCF/COM/WSD and OCR/IR;
- License, Support & Maintenance Costs – pricing policy along with any hidden costs.
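A hedged sketch of how such a score card could drive a side-by-side tool comparison: each criterion gets an assumed weight, each candidate tool a 0-5 score, and the weighted totals produce a ranking. The criteria keys, weights, tool names and scores below are hypothetical placeholders, not recommendations; real values would come from a proof-of-concept evaluation.

```python
# Minimal sketch: weighted score card comparison between candidate tools.
# All weights and scores are hypothetical placeholders for illustration.

CRITERIA_WEIGHTS = {
    "platform_support":    3,
    "technology_support":  2,
    "browser_support":     3,
    "data_source_support": 2,
    "test_type_support":   2,
    "ci_integration":      3,
    "licence_cost":        2,   # higher score = lower / clearer total cost
}

candidates = {   # 0-5 score per criterion for each candidate tool
    "Tool A": {"platform_support": 4, "technology_support": 3,
               "browser_support": 5, "data_source_support": 4,
               "test_type_support": 3, "ci_integration": 3, "licence_cost": 2},
    "Tool B": {"platform_support": 3, "technology_support": 4,
               "browser_support": 3, "data_source_support": 5,
               "test_type_support": 4, "ci_integration": 5, "licence_cost": 4},
}

def total(scores: dict[str, int]) -> int:
    """Weighted sum of a tool's scores across all score card criteria."""
    return sum(w * scores.get(c, 0) for c, w in CRITERIA_WEIGHTS.items())

# Rank the candidate tools by their weighted score card total.
for tool in sorted(candidates, key=lambda t: total(candidates[t]), reverse=True):
    print(f"{tool}: {total(candidates[tool])}")
```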
Example Test Approach Support
This can also be referred to as the test automation framework or testware generation approach that will be used; a minimal keyword/data-driven sketch follows at the end of this section.
An example test approach cross-reference chart is shown below:
Taken from the “Hybrid Keyword Data Driven Framework” presented at the ANZTB conference in 2010, updated with “Test Automation As A Service” presented at STARWest in 2012.
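To illustrate the hybrid keyword/data-driven idea referred to above, the sketch below registers a few reusable keywords and drives them from a data table. The keyword names, step table and `actions` registry are illustrative assumptions and not the framework presented at ANZTB or STARWest; in practice the step table would be read from Excel, CSV, XML or a database (see Data Source Support above).

```python
# Minimal sketch of a hybrid keyword/data-driven runner: keywords map to
# reusable actions, while each test row supplies the data those actions use.

from typing import Callable

actions: dict[str, Callable[..., None]] = {}

def keyword(name: str):
    """Register a function as a reusable keyword."""
    def register(fn):
        actions[name] = fn
        return fn
    return register

@keyword("open_url")
def open_url(url: str) -> None:
    print(f"opening {url}")            # real code would drive a browser here

@keyword("login")
def login(user: str, password: str) -> None:
    print(f"logging in as {user}")

@keyword("verify_title")
def verify_title(expected: str) -> None:
    print(f"verifying title == {expected!r}")

# Data-driven step table (hypothetical); normally loaded from an external
# data source so that non-programmers can add or edit test scenarios.
steps = [
    ("open_url",     {"url": "https://example.test"}),
    ("login",        {"user": "qa1", "password": "secret"}),
    ("verify_title", {"expected": "Dashboard"}),
]

for name, data in steps:
    actions[name](**data)              # dispatch each keyword with its data
```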