Overview of Domain-Driven Testing

Domain-driven testing belongs to the Design patterns category. This pattern is appropriate when testers should be able to write and run automated tests even if they are not adept with the automation tools, or when you want them to start writing test cases for automation before the Software Under Test (SUT) has been completely developed. The pattern is not appropriate for very small-scale or one-off automation efforts. Testers develop a simple domain-specific language in which to write their automated test cases. In practice, this means that actions particular to the domain are described by appropriate commands, each with some required parameters. As an example, imagine that we want to insert a new customer into our system. The domain command would look something like this:

New_Customer (FirstName, LastName, HouseNo, Street, ZipCode, City, State)

Now testers only have to call New_Customer and provide the relevant data for a customer to be inserted. Once the language has been specified, testers can start writing test cases even before the SUT has been implemented.
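A minimal sketch of what this looks like in practice, using the New_Customer command from above (the stub bodies and the Customer_Exists command are hypothetical additions for illustration). Because the command bodies are stubs that the automation team replaces later, testers can write the test before the SUT exists:

```python
# Sketch: a test case written purely in domain commands.
# The command bodies below are stand-in stubs; the automation team
# supplies the real implementations later, so testers can write
# tests before the SUT has been developed.

customers = []  # stand-in for the real system under test


def New_Customer(first, last, house_no, street, zip_code, city, state):
    """Domain command: insert a new customer (stub implementation)."""
    customers.append({"first": first, "last": last, "city": city})


def Customer_Exists(first, last):
    """Domain command: check that a customer was inserted (stub)."""
    return any(c["first"] == first and c["last"] == last for c in customers)


# The tester's test case, written only in domain commands:
New_Customer("Jane", "Doe", "12", "Main Street", "90210", "Springfield", "IL")
assert Customer_Exists("Jane", "Doe")
```

Note that the test case itself is the last two lines; everything a tester writes stays at the domain level, with no tool-specific code in sight.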
What is the Domain-Driven Testing Pattern?

A test automation pattern is a way of solving an issue or problem in test automation that has worked for several people. In other words, a pattern is expert knowledge proven by repeated experience. A pattern is a general, reusable solution to a commonly occurring problem within a given context. Patterns are widely used in software development. A pattern is not
- a finished solution that can "plug in" directly to the situation
- a step-by-step procedure
Classification of Domain-Driven Patterns

Patterns have been divided into categories depending on their scope:
- Process Patterns
- Management Patterns
- Design Patterns
- Execution Patterns
Process Patterns

Process patterns show how the test automation process should be set up or how it can be improved.
Management Patterns

Management patterns show how to manage test automation, whether as an independent project or integrated into the development process.
Design Patterns

Design patterns show how to design the test automation testware so that it is efficient and easy to maintain.
Execution Patterns

Execution patterns show how to keep test execution simple and reliable.
Domain-Driven Test Automation Issues

An issue is a problem that testers and test automation experts experience when attempting test automation. Issues are things that take longer than expected, that hold you back in what you want to automate, or that are simply too much trouble.
- Management Issues - Issues that happen when management has not given the necessary support (budget, resources, team).
- Process Issues - Problems that happen when the test automation process has not reached a certain maturity.
- Design Issues - Issues that occur when an efficient testware architecture and maintainability are not built in from the very beginning.
- Execution Issues - Problems that happen when the automated tests are run (unpredictable results etc.).
Problems in Domain-Driven Test Automation Pattern
- High Maintenance Cost - Maintaining the test automation scripts takes time and costs more than it is worth. When the software changes, e.g., a window sequence is changed, tests fail, and it takes more effort to edit the existing scripts than to start again (typical when capture/replay is used).
- No Previous Test Automation - You are automating for the first time, or automation has failed and you are beginning again.
- Stalled Automation - Automation has been attempted, but it never "got off the ground" or has now stalled.
- Unrealistic Expectations - Management has unrealistic expectations about what test automation can & can't do.
Where did these automation patterns come from?

They grew out of an experiences book published in 2012. Seretta, who contributed Chapter 21, is a developer and was used to working with patterns. When she read the other chapters, she thought: "Patterns! Patterns!" She wrote up test automation patterns as a book and asked Dot to join her.
Implementation

To implement a domain-specific language, scripts or libraries must be written for all the desired domain commands. This is usually done with a test automation framework that supports abstraction levels. There are both advantages and disadvantages to this solution. The most significant advantage is that testers who are not very adept with the tools can still write and maintain automated test cases. The downside is that developers or test automation engineers are needed to implement the commands, so testers are entirely dependent on their "goodwill." Another drawback is that the domain libraries may be implemented in the scripting language of the tool, so changing tools may mean having to start again from scratch.
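The abstraction levels mentioned above can be sketched as two layers (all class and method names here are hypothetical): a tool layer that is the only code touching the automation tool, and a domain layer containing the commands testers use. Keeping the layers separate limits the "start again from scratch" risk to the tool layer:

```python
# Sketch of two abstraction levels (all names are illustrative).

class ToolLayer:
    """The ONLY code that talks to the automation tool / UI.
    Here it just records actions so the sketch is self-contained."""

    def __init__(self):
        self.log = []

    def open_window(self, name):
        self.log.append(f"open {name}")

    def type_into(self, field, value):
        self.log.append(f"type {field}={value}")

    def click(self, button):
        self.log.append(f"click {button}")


tool = ToolLayer()


# Domain layer: the commands testers use. It never drives the tool
# directly, so swapping tools means rewriting only ToolLayer.
def New_Customer(first, last, house_no, street, zip_code, city, state):
    tool.open_window("Customer Entry")
    for field, value in [("FirstName", first), ("LastName", last),
                         ("HouseNo", house_no), ("Street", street),
                         ("ZipCode", zip_code), ("City", city),
                         ("State", state)]:
        tool.type_into(field, value)
    tool.click("Save")


New_Customer("Jane", "Doe", "12", "Main Street", "90210", "Springfield", "IL")
```

If the team later moves to a different tool, only ToolLayer's method bodies change; every domain command and every tester-written test case stays untouched.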
Testware Architecture in Test Automation Patterns

There are many artifacts in test automation, and all of them need a place to "live": scripts, data sets, expected results, actual results, utilities, etc. For efficient working, it is important to ensure that testers and automators know where to find artifacts and can access them quickly and easily as they work. For example, a standard structure for testware allows the information about where artifacts are to be found or placed to be made known to the tools, for example when tests are run or test results are reported. This makes the job of interacting with the tools simpler and less error-prone, because the "where to find it" information is already built into your tools or framework.

Whichever tool you use will have some architecture as its default. One option is to adopt the tool's architecture. Although this may be the easiest option in the beginning, it can have long-term negative consequences, as you will then be "tied" to that tool's current architecture. It is much better to design the architecture of your tests so that it supports the way you want to use your automation. It is relatively straightforward to adapt the testware to the needs of the tool (e.g., copy a script to the place where the tool expects to find it), and this can and should also be automated.

When you design your architecture, you need to decide where you will store the various testware artifacts, i.e., what filing structure you will use, what naming conventions you will use for files and folders, etc. Distinguishing between the test materials and the test results is essential. The test materials are those artifacts that should be in place before a test is executed, such as the test inputs, expected results, any set-up that needs to be done, data required before a test runs, environment settings, and the description/documentation for that test.
Many of these artifacts will be used by more than one test; in this case, they should be accessible to all tests that use them, but there should be only one master copy (apart from legitimately different versions). These artifacts must be under configuration control, for example to prevent people from overwriting each other's edits.

The test results include everything that is produced by the system or software when a test is executed, including the actual results, log files, difference files (between actual and expected results), etc. There are usually many sets of these for a given test, at least for a test that is run many times. These need to be stored differently from the test materials, since one set of materials (for a single test) will produce a new copy of the test results each time it is run. If tests are run daily or more often, these sets of results will soon build up, and you don't want them "clogging" the storage of your tests. It is worth asking whether you need to keep all of the test results each time. You may want to keep the test log to prove that the test was run, but if the test passes, then the actual results have correctly matched the expected results, so why keep two copies of the same thing?

Your testware architecture should implement ABSTRACTION LEVELS. Make sure that you design the structure of your scripts so that tool-specific scripts are kept to a minimum. To make your automation accessible to a wide variety of testers, including those who are not programmers, make sure that you enable them to write and run automated tests smoothly.
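One possible filing structure for the materials/results split described above can be sketched as follows (the folder names and layout are illustrative, not a standard): materials live in a single master copy, while each execution writes into its own results folder so that repeated runs never overwrite one another.

```python
# Sketch of a testware filing structure (names are illustrative).
# Test materials: one master copy, in place BEFORE a test runs.
# Test results: a fresh folder per execution, so runs accumulate
# separately and can be pruned independently of the materials.
import datetime
import itertools
import pathlib
import shutil
import tempfile

root = pathlib.Path(tempfile.mkdtemp())

# Test materials (inputs, expected results, documentation): one master copy.
materials = root / "materials" / "insert_customer"
materials.mkdir(parents=True)
(materials / "expected.txt").write_text("Jane Doe\n")

_run_id = itertools.count(1)


def run_dir(test_name):
    """Create a fresh results folder for one execution of a test."""
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    d = root / "results" / test_name / f"{stamp}-run{next(_run_id)}"
    d.mkdir(parents=True)
    return d


# Two runs of the same test produce two separate result sets:
for _ in range(2):
    out = run_dir("insert_customer")
    (out / "actual.txt").write_text("Jane Doe\n")
    (out / "test.log").write_text("PASS\n")

num_runs = len(list((root / "results" / "insert_customer").iterdir()))
shutil.rmtree(root)  # tidy up this sketch's temporary tree
```

A pruning policy then becomes a simple folder-level operation: for passed runs you might keep only `test.log` and delete `actual.txt`, since it matched the master copy of `expected.txt` anyway.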
Advantages of Domain-Driven Testing

The advantages of domain-driven testing are given below:
- Low maintenance cost
- Improved communication
- Enhanced code reusability
- Low maintenance effort and time
- Increased reliability
- Single ownership
- Single point of change
- Saves time and effort
- Easy to implement, maintain, debug, and scale