Before starting to automate tests, take a step back and define a few aspects.
Consider the test team members
The kind of test scripts you adopt depends on who is on the test team. Across tools such as Tosca, UFT, Ranorex, or Selenium, there are three ways of scripting:
Record : an easy way to start, but potentially expensive to maintain. The profile that typically adopts this way of working is the business user.
Script : a fairly quick way to script, with cheaper maintenance than Record. It requires a test automation expert to develop and maintain the scripts. Any of these tools can achieve this kind of automation.
Data Driven and Workflow : requires both profiles, a business user and a test automation expert. It can take more time to implement, but maintenance is much easier and the scripts can be reused by the business user without requiring an expert.
Ranorex supports the data-driven part, but has no workflow management.
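The data-driven approach described above can be sketched as follows: the script stays generic while each row of test data drives one scenario. All names here (`normalize_title`, `BOOKS`) are hypothetical, not part of any of the tools mentioned.

```python
# Minimal sketch of a data-driven test: the workflow logic stays generic,
# while each row of test data drives one scenario.

def normalize_title(title):
    """Illustrative business rule under test: titles are stored trimmed."""
    return title.strip()

# The data set a business user could maintain (e.g. in an Excel/CSV file)
BOOKS = [
    {"raw": "  Dune ", "expected": "Dune"},
    {"raw": "Foundation", "expected": "Foundation"},
]

def run_data_driven_suite(rows):
    """Run the same script once per data row; return the failing rows."""
    failures = []
    for row in rows:
        if normalize_title(row["raw"]) != row["expected"]:
            failures.append(row)
    return failures

if __name__ == "__main__":
    print(run_data_driven_suite(BOOKS))  # [] when all rows pass
```

A business user can extend `BOOKS` with new rows without touching the script, which is what makes the maintenance cheaper.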
Consider Manual vs Automated testing
You should not automate everything; keep some test cases for manual testing. To decide which ones to automate, keep the following in mind:
How many times will the same script be repeated? If more than four times, you should automate it.
If your tests are UAT (user acceptance tests), exploratory tests, or complex tests with many asynchronous processes, consider manual testing.
Also prioritize test cases by business criticality: the more critical a test case is, the stronger the case for automating it.
Consider Test Data Strategy
Ask yourself the following questions:
Do you need real data whose state evolves through a process? That means your data will change from the beginning to the end of the test. You may need a copy of a production database with data anonymization and a way to reset it, which can be a complicated task. You could also use an advanced service virtualization tool to avoid copying the database and get an instant “refresh”, but this path is also quite expensive in licenses or man-days.
Does your data drive the test? In other words, do you need not only a copy of the database, but also “properties” data to drive all the test scenarios?
Define the right set of data that covers all the business risks. When you generate the test scenarios for a workflow, optimize the number of scenarios rather than enumerating them systematically; otherwise the number of scenarios grows exponentially. See an example below:
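The explosion can be illustrated with a small sketch: the systematic cartesian product of a few workflow parameters is already large, while a risk-based subset that still covers every individual value stays small. The parameter names and values below are hypothetical.

```python
import itertools

# Hypothetical workflow parameters: the full cartesian product explodes.
payment = ["card", "transfer", "cash"]
customer = ["new", "existing"]
country = ["FR", "DE", "UK", "US"]

all_scenarios = list(itertools.product(payment, customer, country))
print(len(all_scenarios))  # 3 * 2 * 4 = 24 combinations

# Risk-based subset: every single parameter value appears at least once,
# instead of testing every combination systematically.
subset = [
    ("card", "new", "FR"),
    ("transfer", "existing", "DE"),
    ("cash", "new", "UK"),
    ("card", "existing", "US"),
]
print(len(subset))  # 4 scenarios still exercise every value
```

With more parameters the gap widens quickly, which is why the selection should be driven by business risk rather than by systematic enumeration.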
Consider Test Script Maintenance
When you create automated tests, you will later need to maintain them for three main reasons:
“Repairing” test cases after business changes.
Enhancing test cases based on issues found in UAT (user acceptance testing) or later in production.
Optimizing or correcting tests that produce false-positive errors.
Consider the right scripting tool
Based on the technologies of your SUT (system under test), find the right scripting tool for the features you need and your budget.
This POC (proof of concept) analyses different service virtualization solutions available on the market.
POC = Proof of concept
Service Virtualization = “A test double often provided as a Software-as-a-Service (SaaS), is always called remotely, and is never working in-process directly with methods or functions. A virtual service is often created by recording traffic using one of the service virtualization platforms instead of building the interaction pattern from scratch based on interface or API documentation.” (ref1)
Stub = “A minimal implementation of an interface that normally returns hardcoded data that is tightly coupled to the test suite.” (ref1)
Mock = “A programmable interface observer, that verifies outputs against expectations defined by the test.” (ref1)
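The stub and mock definitions quoted above can be made concrete with a small sketch; the class and method names here are hypothetical and only illustrate the distinction.

```python
# Illustrative stub vs mock, matching the definitions quoted above.

class BookCatalogStub:
    """Stub: minimal implementation returning hardcoded data."""
    def list_books(self):
        return [{"title": "Dune", "author": "Herbert"}]

class BookCatalogMock:
    """Mock: records the calls it receives so the test can verify
    the interaction against its expectations."""
    def __init__(self):
        self.added = []
    def add_book(self, title, author):
        self.added.append((title, author))

# A stub feeds the code under test with canned data:
stub = BookCatalogStub()
assert stub.list_books()[0]["title"] == "Dune"

# A mock verifies that the expected interaction actually happened:
mock = BookCatalogMock()
mock.add_book("Dune", "Herbert")
assert mock.added == [("Dune", "Herbert")]
```

A virtual service, per the definition above, is the same idea pushed out of process: it is called remotely like the real dependency would be.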
Main features to consider in Service Virtualization
Multi-protocol support (not only HTTP).
Recording real transactions to replay them for testing purposes.
We will use the API from the ft-demo-website project => API. We will explore HPE SV and virtualize two methods from the API:
Read list of books
Add a book
Read list of books
Original : http://../api/listBooks.php
This GET REST API returns the list of books.
We need to configure HPE SV to learn the service:
Go to Add a new virtual service…
Select I don’t have a service description & click Next
Select REST & click Next
Select the HTTP Gateway agent, because we want to simulate the endpoint.
In Real Service > Endpoints, add the original REST API as specified above & click Next
Start learning by selecting Learn, then make a call using the Virtual Service URL, not the real service.
The system learns from this single call. You will see that you were redirected to the real service: while learning, the system captures the call, redirects it, acts as a proxy, and records the answer.
Now switch to Simulate & make the call again using the Virtual Service URL. As you can see, you are no longer redirected, yet you get the same content. The service is now effectively cached; we can say it is stubbed, as per the definition above.
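The Learn/Simulate behaviour just described can be sketched in a few lines. A real SV tool sits in front of the service as an HTTP proxy; to keep the example self-contained, the "real service" here is a plain function, and all names are hypothetical.

```python
# Conceptual sketch of the Learn/Simulate modes described above.

def real_list_books():
    """Stand-in for the real listBooks backend."""
    return [{"title": "Dune", "author": "Herbert"}]

class VirtualService:
    def __init__(self, real_service):
        self.real_service = real_service
        self.recorded = None
        self.mode = "learn"

    def call(self):
        if self.mode == "learn":
            # Learn: proxy the call to the real service and record the answer.
            self.recorded = self.real_service()
            return self.recorded
        # Simulate: replay the recorded answer; the real service is never hit.
        return self.recorded

vs = VirtualService(real_list_books)
learned = vs.call()    # goes through to the real service and records
vs.mode = "simulate"
replayed = vs.call()   # served from the recording
assert learned == replayed
```

This is exactly the stubbing effect observed in the walkthrough: once learned, the virtual service answers with the cached content.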
Original : http://../api/insertBook.php?title=toto&author=tutu&edition=Sun%20Edition
This GET REST API adds a book with the content defined in the parameters. Create a new virtual service as in the steps above, then put both services in Learning mode.
Before starting, reset the database using http://../api/createTable.php (this one will not be simulated), then alternate the calls (list, add, list, add, etc.). Switch to Simulate and redo the same pattern: you will see the system reproduce the same outputs.
Now let's trick the system. Call only listBooks.php: you will notice it loops through the recorded pattern even though we never call insertBook.php! This means you could face issues with complex business processes. How can you solve it? With scripting or a data-driven set.
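Why the looping happens, and how a stateful, data-driven simulation avoids it, can be sketched as follows (all names are hypothetical, not HPE SV internals):

```python
# Naive sequence replay vs a stateful, data-driven simulation.

class SequenceReplay:
    """Replays recorded answers in order, regardless of the request."""
    def __init__(self, recorded_answers):
        self.recorded = recorded_answers
        self.i = 0
    def call(self, request):
        answer = self.recorded[self.i % len(self.recorded)]
        self.i += 1
        return answer

class StatefulSimulation:
    """Keeps its own state, so calling listBooks alone stays consistent."""
    def __init__(self):
        self.books = []
    def call(self, request):
        if request.startswith("insertBook"):
            self.books.append(request.split("=", 1)[1])
            return "ok"
        return list(self.books)

# Recorded pattern was: list -> add -> list
naive = SequenceReplay([[], "ok", ["toto"]])
print(naive.call("listBooks"))  # []
print(naive.call("listBooks"))  # "ok" -- wrong answer, it just loops!

sim = StatefulSimulation()
sim.call("insertBook?title=toto")
assert sim.call("listBooks") == ["toto"]
assert sim.call("listBooks") == ["toto"]  # consistent on repeated calls
```

The stateful variant is the direction the scripting and data-driven options take in the sections below.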
Beyond a simple stub
After recording you can add more data scenarios. You have two options, which can even be combined:
Data-driven set, managed with an Excel file in which you define all the requests and responses required for your tests.
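A data-driven set boils down to a table mapping requests to the responses the virtual service should return. In a real tool this would be an Excel file; the inline CSV below keeps the sketch self-contained, and the data in it is hypothetical.

```python
import csv
import io

# Sketch of a data-driven set: each row maps a request to the response
# the virtual service should return.
DATA_SET = """request,response
listBooks,"[]"
insertBook?title=toto,ok
"""

responses = {row["request"]: row["response"]
             for row in csv.DictReader(io.StringIO(DATA_SET))}

def virtual_service(request):
    """Answer purely from the data set: no scripting, easy to maintain."""
    return responses.get(request, "404 not defined in data set")

assert virtual_service("insertBook?title=toto") == "ok"
print(virtual_service("listBooks"))  # prints []
```

Extending the behaviour then means adding rows to the file, not writing code, which is the maintenance advantage discussed below.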
Data test strategy
If you are stuck with poor data, or it is hard to obtain test data, this can be a good tool to put in place to build a new test data set. Based on your test scenarios, you create all the data you want in requests and answers, and you get your test data set fairly quickly. Consider this when defining your test strategy.
It works perfectly for automated tests, but for manual exploratory tests it will feel weird.
If you need more business logic, that is still possible: for each virtualized service, you can add scripts with business logic. But how far do you want to replicate the business logic? This can get complicated if you need a perfect replication, because you then have two technologies to maintain with the same business logic. Avoid scripts and answer as much as possible with the data-driven set; it is much easier to maintain.
I am currently digging into different SV (service virtualization) solutions such as HP SV, OSV from Tricentis, and CA SV. In this context, SV lets you cut off the real production data and play with test data instead.
What commercial SV can do for you?
Record transactions : you record all the transactions you need for your test scenarios, then the system plays back the responses as required.
Scripting the service with business logic if you need complex responses.
Data driven set : lets you fill an Excel spreadsheet, or any data container, with all the data you need for your test scenarios. Your data set needs to be “stateful business process” oriented, meaning that during the data lifecycle an object can change state and value.
This last feature is a good candidate if you are stuck with a legacy or new application and need to put a test data set in place. Most commercial tools work with different protocols (HTTP/HTTPS, MQ, filesystem, FTP/S, etc.). And if price is a show-stopper and you only need HTTP/HTTPS, some open-source tools are also good candidates. Here you can find a good list of open-source and commercial tools.