A significant part of testing activities involves making quick judgements and getting first impressions. Often, these are followed by more detailed investigations and experiments. To support these activities, heuristics can be of great help.
In the following sections we describe examples of commonly used heuristics, both for static testing and dynamic testing. Keep in mind that the same heuristics may already be very useful during the creative process where documents and systems are created.
The results of a review (of documents, program code or any other source) tend to rely on the reviewer's experience. We would like some consistency across reviews performed by different people, and heuristics can be of great help here. The following sections show heuristics that can be used during refinements, reviews and other forms of static testing. Most heuristics for static testing can be used as a checklist to investigate whether the test object complies with the various parts of the heuristic.
SMART is an acronym of the following terms:
- Specific – provide a specific description with sufficient details
- Measurable – quantify an indicator of progress and/or success
- Achievable – is it possible within the restrictions and conditions of the situation
- Relevant – does it relate to the objectives for the specific situation
- Time-bound – are timelines clearly stated
There are many variations and extensions of the SMART acronym; all have the same goal of assisting the reviewer.
INVEST is an acronym of the following terms, specifically meant to review user stories [Wake 2003]:
- Independent – of all other user stories
- Negotiable – not a specific contract for features (captures the essence, not the details)
- Valuable – to the customer
- Estimable – to a good approximation
- Small – so as to fit within an iteration
- Testable – in principle, even if there isn’t a test for it yet
More about INVEST can be found in section “Description of quality measures“.
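To give such a review some structure, the reviewer's judgement per INVEST criterion can be recorded explicitly. The sketch below is a minimal Python illustration of that idea; the class and field names are our own assumptions, not a prescribed format.

```python
from dataclasses import dataclass, fields

@dataclass
class InvestCheck:
    """Reviewer's judgement per INVEST criterion for one user story."""
    independent: bool
    negotiable: bool
    valuable: bool
    estimable: bool
    small: bool
    testable: bool

    def failed_criteria(self) -> list[str]:
        """Criteria the user story does not (yet) satisfy."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

# Example: a story that is too large to fit within one iteration.
review = InvestCheck(independent=True, negotiable=True, valuable=True,
                     estimable=True, small=False, testable=True)
print("Rework needed for:", review.failed_criteria())  # ['small']
```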
When reviewing a text, such as a user story or requirements specification, look for words like "always" and "never" and try to imagine situations that do not comply with these absolute terms.
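This check can be partly automated. The sketch below (plain Python; the word list and function name are our own assumptions, not part of any TMAP tooling) flags sentences in a requirements text that contain absolute wording, so the reviewer can challenge them.

```python
import re

# Illustrative list of absolute terms; extend it to fit your own documents.
ABSOLUTE_TERMS = {"always", "never", "all", "none", "every", "only"}

def flag_absolute_terms(text: str) -> list[str]:
    """Return the sentences that contain an absolute term and deserve a closer look."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    flagged = []
    for sentence in sentences:
        words = {word.lower().strip(",.;:") for word in sentence.split()}
        if words & ABSOLUTE_TERMS:
            flagged.append(sentence.strip())
    return flagged

if __name__ == "__main__":
    requirement = ("The system always responds within two seconds. "
                   "Users can export reports to PDF.")
    for sentence in flag_absolute_terms(requirement):
        print("Review:", sentence)
```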
When reviewing a text, such as a user story or a requirements specification, ask yourself whether the description is logical. This can be done in various ways: is it consistent with other systems, with the previous version of the same system, with the expectations of the targeted users, and with the corporate image of the company that will supply it to its customers? And, most importantly: is it fit for purpose, i.e. does it solve the user's problem?
Dynamic tests can roughly be prepared in two ways: using test design techniques to create fully prepared test scenarios that are executed at a later stage, or using experience in an exploratory approach where one (or a few) test cases are created and executed right away.
The distinction between heuristic and test design technique is not a strict one. In her book, Elisabeth Hendrickson describes CRUD as a heuristic [Hendrickson 2013], whereas we describe it as part of the test design technique Data Cycle Test (DCyT). (Note: CRUD is an acronym for Create, Read, Update, Delete: the actions that relate to the lifecycle of a data item.)
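As an illustration of CRUD used as a heuristic, the sketch below walks one data item through its full lifecycle against a hypothetical in-memory repository. The CustomerRepository class is invented for this example; a Data Cycle Test would derive such steps more systematically from a CRUD matrix.

```python
class CustomerRepository:
    """Hypothetical in-memory store used to illustrate the CRUD lifecycle."""
    def __init__(self):
        self._items = {}
    def create(self, key, value):
        self._items[key] = value
    def read(self, key):
        return self._items.get(key)
    def update(self, key, value):
        if key not in self._items:
            raise KeyError(key)
        self._items[key] = value
    def delete(self, key):
        self._items.pop(key, None)

def test_customer_data_cycle():
    """Exercise Create, Read, Update and Delete for one data item."""
    repo = CustomerRepository()
    repo.create("C001", {"name": "Alice"})            # Create
    assert repo.read("C001") == {"name": "Alice"}     # Read after create
    repo.update("C001", {"name": "Alice B."})         # Update
    assert repo.read("C001") == {"name": "Alice B."}  # Read after update
    repo.delete("C001")                               # Delete
    assert repo.read("C001") is None                  # Read after delete
```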
Below are a few ideas on how to test numeric information (based on [Hendrickson 2013]).
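One way to put such ideas into practice is a parameterised test that feeds a range of numeric edge values into the function under test. The parse_amount function and the specific values in the sketch below are illustrative assumptions of our own, not a list taken from [Hendrickson 2013].

```python
import pytest

def parse_amount(value: str) -> float:
    """Hypothetical function under test: parses a monetary amount entered as text."""
    amount = float(value)
    if amount < 0:
        raise ValueError("amount must not be negative")
    return amount

# Illustrative numeric edge values; adapt them to the domain of the system under test.
@pytest.mark.parametrize("text, expected", [
    ("0", 0.0),                   # zero
    ("0.01", 0.01),               # smallest positive amount
    ("999999999", 999999999.0),   # very large value
    ("3.14159", 3.14159),         # many decimals
])
def test_parse_amount_accepts_valid_numbers(text, expected):
    assert parse_amount(text) == expected

@pytest.mark.parametrize("text", ["-1", "abc", ""])
def test_parse_amount_rejects_invalid_input(text):
    with pytest.raises(ValueError):
        parse_amount(text)
```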
Below are a few examples of other heuristics.
You will find more examples in your day-to-day life at the office.