Scenario-Based Testing
Multi-view modeling languages like the Unified Modeling Language (UML) offer different diagram types to lower the complexity of describing software systems. Each diagram provides a distinct viewpoint of the system, allowing modelers to focus on certain aspects while temporarily abstracting from others. At the same time, the diagrams complement one another and together provide a holistic representation of the system.
Model-driven engineering (MDE) lifts such models to first-class software artifacts, from which executable code is generated. As a consequence, typical development activities like testing may be shifted from the code level to the model level.
We therefore propose to exploit the overlapping information inherent in the different viewpoints of a model for automatic testing. To this end, we describe a possible encoding of the problem for the model checker SPIN.
Scenario-Based Testing
Consider the following example modeled in the diagrams below. Two state machines show the typical behavior of a PhD student (PhD) and a coffee machine (CM). Both state machines change their states according to the messages they receive. The conditions for the state transitions are given as transition labels. Each transition label consists of two parts: the left part denotes the action triggering the transition, and the right part the action performed during the transition. If no triggering action is defined (“−”), the transition is executed unconditionally.
Starting in state Tired, the PhD student turns on the coffee machine and optimistically waits until it is ready. If she receives the error() message, she becomes desperate, then tired, and tries again. Otherwise, she is happy, demands coffee, and waits until it is completed. The sequence diagram, depicted to the right of the state machines, models a forbidden scenario inside a neg fragment: after the coffee machine has sent an error, it receives a coffee request and then sends a “coffee complete” message. And indeed, a parallel execution of the PhD and CM state machines does not allow this sequence; they are thus correct with respect to the forbidden scenario modeled in the sequence diagram.
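Anticipating the SPIN encoding described below, each state machine can be turned into a PROMELA process and each exchanged message into a channel operation. The following is only a minimal sketch of the example: the message names (turnOn, errorMsg, ready, getCoffee, coffeeComplete), the channel layout, and the nondeterministic error/ready choice in the coffee machine are illustrative assumptions; the actual encoding is provided in the download below.

    /* Assumed message vocabulary and channel layout (illustrative only). */
    mtype = { turnOn, errorMsg, ready, getCoffee, coffeeComplete };

    chan toCM  = [1] of { mtype };   /* PhD student -> coffee machine */
    chan toPhD = [1] of { mtype };   /* coffee machine -> PhD student */

    /* PhD student: states Tired, Waiting, Desperate, Happy, AwaitCoffee */
    active proctype PhD()
    {
    Tired:        toCM!turnOn;                       /* "- / turnOn()" fires unconditionally */
                  goto Waiting;
    Waiting:      if
                  :: toPhD?errorMsg -> goto Desperate   /* "error() / -" */
                  :: toPhD?ready    -> goto Happy       /* "ready() / -" */
                  fi;
    Desperate:    goto Tired;                        /* becomes tired and tries again */
    Happy:        toCM!getCoffee;                    /* "- / getCoffee()" */
                  goto AwaitCoffee;
    AwaitCoffee:  toPhD?coffeeComplete;              /* "coffeeComplete() / -" */
                  goto Tired
    }

    /* Coffee machine: after being turned on, it either fails or becomes ready. */
    active proctype CM()
    {
    Off:          toCM?turnOn;
                  if
                  :: toPhD!errorMsg -> goto Off      /* report an error */
                  :: toPhD!ready    -> goto Serving
                  fi;
    Serving:      toCM?getCoffee;
                  toPhD!coffeeComplete;
                  goto Off
    }

Each transition label thus maps to a receive for its trigger and a send for its action; trigger-less transitions (“−”) become unconditional statements.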
In model-driven engineering, such models are not mere design documents but artifacts from which code is generated. It is therefore important to detect faults in the models; otherwise they might propagate to the code. Designing test cases on the model level has been the subject of extensive research, but the testing itself is often transferred to the code level or requires a simulation engine. In this paper, we propose to use the communication scenarios modeled in sequence diagrams for testing purposes and to transfer the testing problem into a model checking problem by an adequate encoding. Testing is thus shifted to the model level.
We propose to use the model checker SPIN and its input language PROMELA to verify whether a set of state machines fulfils a safety property described as a neg fragment of a sequence diagram. To this end, we encode the state machines as a set of active PROMELA processes and the neg fragment as a notrace assertion. In verification mode, SPIN checks whether the behavior specified in the assertion occurs on any execution trace of the processes executing in parallel. If it does, SPIN reports the erroneous execution path on which the forbidden behavior occurred; otherwise, it reports no error.
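As a sketch of this second part of the encoding, reusing the channel and message names assumed above, the forbidden scenario of the neg fragment can be stated as an event assertion over the two channels; again, the exact formulation in the downloadable encoding may differ.

    /* Forbidden scenario (illustrative): the coffee machine sends error(),
       then accepts a coffee request, then sends "coffee complete". */
    notrace {
        toPhD!errorMsg;        /* CM reports an error ...               */
        toCM?getCoffee;        /* ... yet receives a coffee request ... */
        toPhD!coffeeComplete   /* ... and completes the coffee anyway.  */
    }

Running spin -a on the combined model generates the verifier source pan.c; compiling and executing the resulting verifier performs the exhaustive check, and if the assertion is violated, the reported trail can be replayed with spin -t to inspect the erroneous execution path.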
Download
We illustrate our approach based on the example modeled above. The PROMELA encoding of the state machines and the forbidden scenario may be downloaded here.