
An Overview of Verification and Validation

Jayantha Amararachchi
Senior Lecturer (HG)

Lecture Overview
V&V definitions
Different Test Purposes
Classification of Test Methods
Planning and Organisation
When to stop testing
Remaining Assessments
Future Submissions

Verification and Validation

Definition 1

Assuring the software system meets user's needs
Verification vs. Validation

Definition 2

Verification: to establish the truth of the relationship between a software product and its specification

Validation: to establish the fitness or worth of a software product for its operational mission

Verification vs. Validation

Definition 3

Verification: "Are we building the product right?"
The software should conform to its specification

Validation: "Are we building the right product?"
The software should do what the user really requires
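To make the distinction concrete, here is a minimal illustrative sketch (the calculate_discount function and its 10%-over-100 rule are invented for this example, not part of the lecture): the verification test checks conformance to the written specification, while the validation test checks that the user's real need is met.

import unittest

def calculate_discount(order_total):
    """Hypothetical spec: orders of 100 or more get a 10% discount; smaller orders none."""
    return order_total * 0.9 if order_total >= 100 else order_total

class VerificationTest(unittest.TestCase):
    """Verification: are we building the product right (does it conform to the spec)?"""
    def test_boundary_follows_spec(self):
        self.assertEqual(calculate_discount(100), 90.0)   # spec says >= 100 qualifies
        self.assertEqual(calculate_discount(99), 99)      # just below the boundary

class ValidationTest(unittest.TestCase):
    """Validation: are we building the right product (does it meet the user's need)?
    The assumed need is that a typical customer basket really does get cheaper."""
    def test_regular_basket_is_cheaper(self):
        self.assertLess(calculate_discount(120), 120)

if __name__ == "__main__":
    unittest.main()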

Different Test Purposes

Testing vs. debugging

Defect testing and debugging are distinct processes
Verification and validation is concerned with establishing the existence of defects in a program
Debugging is concerned with locating and repairing these errors
Debugging involves formulating a hypothesis about program behaviour and then testing that hypothesis to find the system error

The debugging process

[Process diagram: the test results, the specification and the test cases feed a cycle of Locate error -> Design error repair -> Repair error -> Re-test program]
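A small illustrative sketch of that cycle, assuming a hypothetical average function with an invented off-by-one defect: a defect test establishes that a fault exists, a hypothesis about the divisor is formed and confirmed, the error is repaired, and the program is re-tested.

def average_buggy(values):
    # Faulty version: off-by-one in the divisor (the kind of error debugging must locate).
    return sum(values) / (len(values) - 1)

def average_fixed(values):
    # Repaired version, after the hypothesis "the divisor excludes one element" was confirmed.
    return sum(values) / len(values)

def test_average(fn):
    """Defect test: establishes whether a defect exists at all (testing's job)."""
    return fn([2, 4, 6]) == 4

if __name__ == "__main__":
    print("buggy version passes:", test_average(average_buggy))   # False: a defect exists
    print("fixed version passes:", test_average(average_fixed))   # True: repair confirmed by re-test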

What can we do about faults?

Three approaches to reducing the number of faults in computer systems:
avoidance
detection & removal
tolerance

1. Fault Avoidance

Aim: to ensure faults are not introduced into a system in the first place

Techniques:
precise SW development process
defensive programming (see the sketch below)
reviews of requirements, design, and implementations
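As a concrete illustration of defensive programming, here is a minimal sketch (the withdraw function and its rules are invented, not from the slides): the routine checks its own preconditions instead of trusting the caller, so bad values are rejected at the interface before they can corrupt later computation.

def withdraw(balance: float, amount: float) -> float:
    """Return the new balance after withdrawing `amount`, checking preconditions defensively."""
    if not isinstance(amount, (int, float)):
        raise TypeError("amount must be a number")
    if amount <= 0:
        raise ValueError("amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

if __name__ == "__main__":
    print(withdraw(100.0, 30.0))       # 70.0
    try:
        withdraw(100.0, -5.0)          # rejected by the precondition check
    except ValueError as err:
        print("rejected:", err)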

2a. Discovery and Removal

Aim: to discover and remove faults

Techniques:
software testing
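For example, a short unit test suite in the style below can discover a fault so that it can be removed; the is_leap_year function and its test values are an invented illustration, not material from the lecture.

import unittest

def is_leap_year(year: int) -> bool:
    """Gregorian leap-year rule."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class LeapYearTests(unittest.TestCase):
    def test_typical_and_century_cases(self):
        self.assertTrue(is_leap_year(2024))    # divisible by 4
        self.assertFalse(is_leap_year(1900))   # century year, not divisible by 400
        self.assertTrue(is_leap_year(2000))    # divisible by 400
        self.assertFalse(is_leap_year(2023))   # ordinary year

if __name__ == "__main__":
    unittest.main()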

2b. Run Time Detection

Aim: to identify errors at run time, before they lead to system failures

Techniques:
functionality checks
checksums, etc.
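A brief sketch of both techniques, using Python's standard zlib.crc32 for the checksum (the record format itself is invented for illustration): the stored checksum lets corruption be detected at run time, before the bad data can lead to a system failure.

import zlib

def make_record(payload: bytes):
    """Store the data together with a CRC-32 checksum."""
    return payload, zlib.crc32(payload)

def read_record(payload: bytes, checksum: int) -> bytes:
    """Run-time functionality check: detect corruption before it causes a failure."""
    if zlib.crc32(payload) != checksum:
        raise ValueError("checksum mismatch: record is corrupted")
    return payload

if __name__ == "__main__":
    data, crc = make_record(b"sensor reading: 42")
    print(read_record(data, crc))                  # passes the check
    try:
        read_record(b"sensor reading: 99", crc)    # corruption detected at run time
    except ValueError as err:
        print("detected:", err)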

3. Fault Tolerance

Aim: to use redundancy to provide increased safety for computer systems

Techniques:
SW redundancy: information redundancy, multiple computations
HW redundancy: passive or active replication
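A minimal software-redundancy sketch (the three integer square-root versions are invented for illustration): the same result is computed by multiple independent routes and a majority vote masks a single faulty version.

from collections import Counter

def fault_tolerant_isqrt(n: int) -> int:
    """Compute floor(sqrt(n)) by three independent methods and vote on the result."""
    version_a = int(n ** 0.5)                                            # floating-point route
    version_b = next(i for i in range(n + 2) if (i + 1) * (i + 1) > n)   # linear search
    x = n                                                                # integer Newton iteration
    while x * x > n:
        x = (x + n // x) // 2
    version_c = x

    answer, votes = Counter([version_a, version_b, version_c]).most_common(1)[0]
    if votes < 2:
        raise RuntimeError("no majority: all versions disagree")
    return answer

if __name__ == "__main__":
    print(fault_tolerant_isqrt(26))   # 5, even if one version were to misbehave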

Classification of Test Methods

Static vs. Dynamic

Software inspections: concerned with analysis of the static system representation to discover problems (static verification)
May be supplemented by document and code analysis

Software testing: concerned with exercising and observing product behaviour (dynamic verification)
The system is executed with test data and its operational behaviour is observed
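An illustrative contrast between the two, with an invented source snippet and an invented inspection rule: the static check analyses the code's representation without running it, while the dynamic check executes the code with test data and observes its behaviour.

import ast

SOURCE = """
def divide(a, b):
    return a / b
"""

def static_inspection(source: str):
    """Static verification: analyse the program text without executing it."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        # Example inspection rule: flag divisions whose divisor is not obviously checked.
        if isinstance(node, ast.BinOp) and isinstance(node.op, ast.Div):
            findings.append(f"line {node.lineno}: division with an unchecked divisor")
    return findings

def dynamic_test(source: str) -> bool:
    """Dynamic verification: execute the code with test data and observe the result."""
    namespace = {}
    exec(source, namespace)
    return namespace["divide"](10, 4) == 2.5

if __name__ == "__main__":
    print("inspection findings:", static_inspection(SOURCE))
    print("test with data passed:", dynamic_test(SOURCE))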

White box vs. Black box

White box tests focus on the internal structure of a component

Black box tests focus on the input/output behaviour of a component
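A small sketch of the contrast, using an invented shipping_fee function: the black-box tests are derived only from the stated input/output behaviour, while the white-box tests are chosen to exercise each branch and boundary of the internal structure.

import unittest

def shipping_fee(weight_kg: float) -> float:
    """Hypothetical spec: free up to 1 kg, 5.00 up to 10 kg, 20.00 above that."""
    if weight_kg <= 1:
        return 0.0
    elif weight_kg <= 10:
        return 5.0
    else:
        return 20.0

class BlackBoxTests(unittest.TestCase):
    """Derived from the input/output specification, with no knowledge of the code."""
    def test_specified_behaviour(self):
        self.assertEqual(shipping_fee(0.5), 0.0)
        self.assertEqual(shipping_fee(5.0), 5.0)
        self.assertEqual(shipping_fee(25.0), 20.0)

class WhiteBoxTests(unittest.TestCase):
    """Derived from the internal structure: one test per branch, including boundaries."""
    def test_each_branch_and_boundary(self):
        self.assertEqual(shipping_fee(1.0), 0.0)     # boundary of the first branch
        self.assertEqual(shipping_fee(10.0), 5.0)    # boundary of the second branch
        self.assertEqual(shipping_fee(10.01), 20.0)  # third branch

if __name__ == "__main__":
    unittest.main()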

Planning and Organisation


V & V goals

Verification and validation should establish confidence that the software is fit for the purpose

This does NOT mean that it is completely free of defects

Rather, it must be good enough for its intended use, and the type of use will determine the degree of confidence that is needed

V & V confidence

Depends on the system's purpose and marketing environment

Software function
The level of confidence depends on how critical the software is to an organisation

Marketing environment
Getting a product to market early may be more important than finding most defects in the program

V & V planning

Careful planning is required to get the most out of testing and inspection processes

Planning should start early in the development process

The plan should identify the balance between static verification and testing

Test planning is about defining standards for the testing process rather than describing product tests

Testing has its own Life Cycle

Establish the test objectives
Design the test cases
Write the test cases
Test the test cases
Execute the tests
Evaluate the test results
Change the system
Do regression testing
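As a sketch of the final "do regression testing" step, assuming a project whose test cases live in a hypothetical tests/ directory of unittest modules, the whole saved suite can be rediscovered and re-run after every change so that earlier behaviour is confirmed to still hold.

import unittest

# Regression run: after changing the system, rediscover and re-run the whole saved
# test suite so that previously passing behaviour is checked again.
if __name__ == "__main__":
    suite = unittest.defaultTestLoader.discover(start_dir="tests", pattern="test_*.py")
    result = unittest.TextTestRunner(verbosity=2).run(suite)
    raise SystemExit(0 if result.wasSuccessful() else 1)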

Test Team

[Diagram: the test team draws on a professional tester, programmer, analyst, user, configuration management specialist and system designer, with a note that one of these roles is "too familiar with code"]

When to Stop Testing

The view that testing first discovers the trivial, easy-to-correct faults and only later the more difficult ones is WRONG

How can you know when you've discovered most of them, or the most important ones?

Remaining Assessments

Website (upload to the server) - 5 marks
Research Paper - 5 marks
Supervisor visits & log book - 5 marks
Final Presentation & Demonstration - 15 marks
Viva - 15 marks

Future submissions

Website upload - before 2nd Oct.
Collect Mid-Review Report (from the supervisor) - before 9th Oct.
Research Paper (to the supervisor) - 12th Oct.
Final Report (soft bound) - 9th Nov.
A CD containing all submissions - 9th Nov.
Final Presentation & Demo - 16th-20th Nov.
Viva - 16th-20th Nov.
Final Report (hard bound) - depends, will be informed later
