Lecturers:

"There is a strong need for the use of advanced, systematic techniques and tools that support the system validation process." [Klaus Havelund]
Software is becoming increasingly complex, and there is a growing awareness within software engineering practice that both formal verification and testing techniques are needed to deal with this growing complexity. This is particularly true in areas such as embedded systems in consumer electronics, time-dependent systems in safety-critical applications, and communication protocols in the telecommunication industry.
The focus of this course is on techniques and software tools that can be used to assess the quality and correctness of software systems. The first part of the course considers tools and techniques for formal verification of system designs; the second part considers tools and techniques for testing system implementations.
In the (logical) first part of the course, particular emphasis will be put on our own real-time validation tool UPPAAL. Later in the course, the commercial tool visualSTATE will also be presented, and we will evaluate the usefulness of the various features of these tools. Emphasis will be on application and hands-on experiments with the tools on small case studies.
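To give a flavour of what tools like UPPAAL do under the hood, the following is a minimal sketch of explicit-state reachability analysis, the core of model checking. It is an illustration only, not UPPAAL's actual algorithm: UPPAAL additionally handles real-time clocks symbolically, which this sketch omits, and the toy mutual-exclusion system below is a made-up example.

```python
# Minimal sketch of explicit-state reachability analysis (illustration only;
# real model checkers such as UPPAAL also treat clocks symbolically).
from collections import deque

def reachable(initial, transitions):
    """Breadth-first search over a finite transition system.

    transitions: dict mapping a state to an iterable of successor states.
    Returns the set of all states reachable from `initial`.
    """
    seen = {initial}
    queue = deque([initial])
    while queue:
        state = queue.popleft()
        for succ in transitions.get(state, ()):
            if succ not in seen:
                seen.add(succ)
                queue.append(succ)
    return seen

# Toy two-process system: 'cc' (both processes critical) must be unreachable.
system = {
    'ii': ['ci', 'ic'],   # both idle -> one process enters its critical section
    'ci': ['ii'],         # first process leaves its critical section
    'ic': ['ii'],         # second process leaves its critical section
}
assert 'cc' not in reachable('ii', system)  # the safety property holds
```

Verifying a safety property then amounts to checking that no "bad" state is in the reachable set, exactly as in the assertion above.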
In the (logical) second part of the course, emphasis will be on methods and techniques for testing.
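As a small taste of the xUnit-style, test-driven development covered in this part of the course, here is a hedged sketch using Python's unittest module (a member of the xUnit family). The `bounded_add` function is a made-up example, not course material; in test-driven development its tests are written first, run red, and then turned green by the smallest possible implementation change.

```python
# Sketch of xUnit-style unit testing with Python's unittest.
# `bounded_add` is a hypothetical function invented for this example.
import unittest

def bounded_add(a, b, limit=100):
    """Add two numbers, capping the result at `limit` (written test-first)."""
    return min(a + b, limit)

class BoundedAddTest(unittest.TestCase):
    # In TDD, these tests exist before the implementation above.
    def test_sum_below_limit(self):
        self.assertEqual(bounded_add(2, 3), 5)

    def test_sum_is_capped(self):
        self.assertEqual(bounded_add(70, 50), 100)
```

Saved as a file, the tests run with `python -m unittest <file>`, reporting each failing assertion individually.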
In addition, the course will involve guest lectures from colleagues at other universities.
The aims of the course are that students become familiar with validating software systems (in particular real-time and embedded systems) on the basis of formal models and specifications, with the aid of software tools for verification and analysis, and that they learn practical techniques and methods for testing software systems.
Due to teaching capacity constraints and study-board remuneration rules, we need to give the course jointly for several semesters.
No. | Dat8 | SW8 | SP2 | Lecture date | Lecture room | Exercise room | Lecturer | Slides | Subject
1. | ☻ | ☻ | ☻ | Feb 5 | 0.2.12 8.15-10.00 | PC-Lab | BN | | Introduction
2. | ☻ | ☻ | ☻ | Feb 7 | 0.2.12 8.15-10.00 | PC-Lab (0.2.18) | BN | |
3. | ☻ | ☻ | ☻ | Feb 12 | 0.2.12 8.15-10.00 | PC-Lab (0.2.18) | AD | |
4. | ☻ | ☻ | ☻ | Feb 14 | NO LECTURE | Group Rooms + PC-Lab (0.2.18) | BN | | Hand in March 18
5. | ☻ | ☻ | ☻ | Feb 19 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | BN | testIntro |
6. | ☻ | ☻ | ☻ | Feb 21 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | ASk | whitebox | Classical Test 1+2 (test case design techniques I: whitebox; test case design techniques II: blackbox + coverage)
7. | ☻ | ☻ | ☻ | Feb 26 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | BN | tdd xUnit | Test-Driven Development + xUnit
8. | ☻ | ☻ | ☻ | Feb 28 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | BN | fsm-based | Model-Based Testing
13. | ☻ | ☻ | | March 13 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | AD | | Timed Games and Uppaal-TIGA; Exercise 28+29
March 4, 6, 11, 13 | BN travelling
9. | ☻ | ☻ | ☻ | March 18 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | BN | RT-Test | Model-Based Testing (online real-time, Uppaal TRON)
March 20 | Påske (Easter)
10. | ☻ | ☻ | ☻ | NO LECTURE | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | BN | | Hand in May 5
11. | ☻ | ☻ | | April 21 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | Ulrik Nyman | |
12. | ☻ | ☻ | | April 22 (cancelled due to lack of participation) | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | Ulrik Nyman | | VisualState II
May 1 | Kristi Himmelfart (Ascension Day)
14. | ☻ | ☻ | | May 13 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | Juhan Ernits | |
15. | ☻ | ☻ | | May 15 | 0.2.12 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | Juhan Ernits | |
?? | ☻ | ☻ | | 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | KGL | | Performance Modelling
?? | ☻ | ☻ | ☻ | ?? 8.15-10.00 | Group Rooms + PC-Lab (0.2.18) | Guest | | SW Test in Practice (TK-Validate)
Dates: