Professional Documents
Culture Documents
QA Training
www.valuelabs.com
Agenda
Overview
Introduction
Overview
Contents
Introduction to Software & Software Engineering
SDLC Models
Waterfall
RAD
Incremental
PDCA
Software
Software Engineering
PDCA
Introduction
Contents
Software Quality Via Quality Attributes
QA, QC and Testing
Role of QA in SDLC
Need for Testing
Testing Life Cycle
Software Quality
Quality Attributes
Safety
Modularity
Security
Complexity
Reliability
Portability
Robustness
Usability
Understandability
Reusability
Testability
Efficiency
Adaptability
Learnability
Conformance to requirements
Quality Assurance
Quality Control
Testing is one example of quality control; there are others, such as inspections
Lowered Risk
Reduced Rework
Serving as a Differentiator
Testing
Role of QA in SDLC
Requirements &
Specification
Design
Coding
Testing
Delivery
Test Methodologies & Test Plan
Contents
Testing Methodologies
Software Testing Classification
Test Plan
Need for Test Plan
Problems
Contents of a Test Plan
Test Plan Template
To Conclude
Testing Methodologies
There are numerous methodologies available for testing software.
The methodology we choose depends on factors such as the nature of
the project, the project schedule, and resource availability
Static Analysis
Dynamic Analysis
White Box
Black Box
Manual
Automation
It is testing that takes into account the internal mechanism of a system or
component.
It can be done only for applications whose code is available for testing.
It checks the functionality of a module, including the module internals, i.e.
the code (loops and conditional statements), verifying that the loops are
executed in the proper way.
Also known as structural testing, clear box testing and glass box testing
Path Testing: Ensures that all independent paths through a code module have
been tested
Branch Testing (Conditional Testing): A test method that aims to ensure that
each possible branch from each decision point (e.g. an "if" statement) is executed
at least once, thus ensuring that all reachable code is executed
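As a sketch of branch testing (the function and inputs below are hypothetical, not from the material above), each branch of the decision point gets at least one test input:

```python
def classify(n):
    """Toy function with a single decision point (two branches)."""
    if n >= 0:
        return "non-negative"   # branch taken when the condition is true
    return "negative"           # branch taken when the condition is false

# Branch testing: at least one input per possible branch of the decision.
assert classify(5) == "non-negative"   # exercises the true branch
assert classify(-3) == "negative"      # exercises the false branch
```

With both assertions passing, every branch of the `if` has been executed at least once.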
Disadvantages:
The test is unbiased because the designer and the tester are independent of
each other.
The tester does not need to acquire knowledge of any specific programming
languages.
The test is done from the point of view of the user, not the designer.
Disadvantages:
The test can be redundant if the software designer has already run a test
case.
The test cases are difficult to design.
Testing every possible input stream is unrealistic because it would take an
inordinate amount of time; therefore, many program paths will go untested.
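Because exhaustive input testing is unrealistic, black box testers commonly respond with equivalence partitioning and boundary values, a technique not named above; the sketch below assumes a hypothetical spec:

```python
def is_adult(age):
    """Hypothetical spec: ages 18 and above count as adult."""
    return age >= 18

# One representative input per equivalence class, plus the boundary,
# instead of every possible age.
assert is_adult(17) is False   # "minor" class, just below the boundary
assert is_adult(18) is True    # boundary value
assert is_adult(40) is True    # representative of the "adult" class
```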
Performance testing: The process of testing the run-time performance of the
software, to see whether the system performs up to the client's performance
requirements. Aspects measured include
Connection time
Response time
Send time
Process time
Transaction time
Load testing: Defines the maximum amount of work a system can handle without
significant performance degradation.
Stress testing: The process of determining the ability of the system to maintain a
certain level of effectiveness under unfavorable conditions. It evaluates the extent to
which a system keeps working when subjected to extreme workloads or when some of
its hardware or software has been compromised
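A minimal sketch of the response-time aspect, assuming a hypothetical `handle_request` operation standing in for the real system under test:

```python
import time

def handle_request(payload):
    """Hypothetical operation under test, standing in for a real request."""
    return sum(range(payload))

def average_response_time(payload, runs=5):
    """Average wall-clock time per call -- the response-time aspect above."""
    start = time.perf_counter()
    for _ in range(runs):
        handle_request(payload)
    return (time.perf_counter() - start) / runs

# Load-testing idea: increase the amount of work until response time
# degrades beyond what the client's requirements allow.
for load in (1_000, 10_000, 100_000):
    print(f"load={load:>7}: {average_response_time(load):.6f}s per request")
```

Real load and stress tests use dedicated tooling and many concurrent users; this only illustrates the idea of measuring response time as load grows.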
Acceptance Testing:
Ensures the software works correctly for its intended user(s) in their normal
work environment(s).
Alpha test: a version of the complete software is tested by the customer under the
supervision of the developer at the developer's site. The developer observes the usage
of the system, records usage problems and errors, and analyzes the system for bugs in
order to resolve them.
Beta test: a version of the complete software is tested by the customer (or selected
personnel) at his or her own site without the developer being present. The system is
tested using real data in the real user environment, which the developer cannot control.
All problems encountered by the users are reported back to the developer at regular
intervals.
User Acceptance test: the final testing process before a new system is accepted for
operational use by the client. Its purpose is to get confirmation from the client, through
trial or review of the object under test, that the system meets the requirement
specifications
Monkey testing: testing an application with stochastic (random) inputs, without any
specific tests in mind.
The tests follow no logic, and there is no intent to learn the system.
Regression testing
Test Plan
Problems
When a project does not identify its overall approach to testing, it gives
rise to the problems listed below:
Scope
Scope clauses define what features will be tested. An aid to doing
this is to prioritize them using a technique such as MoSCoW
Resource
Resource clauses give an overall view of the resources needed to deliver
the tasks.
Time
Time clauses specify what tasks are to be undertaken to meet the
quality objectives, and when they will occur.
Quality
Quality clauses define the standard required from the testing activities.
Introduction: A high level view of the testing standard required, including what
type of testing it is.
Approach: The details of how the testing process will be followed.
Item Pass/Fail Criteria: Defines the pass and failure criteria for an item being
tested.
Test Deliverables: Which test documents and other deliverables will be
produced.
Risk
Risk clauses define in advance what could go wrong with a plan and the measures
that will be taken to deal with these problems.
Introduction
Test Items
Features to be Tested
Approach
Test Deliverables
Testing Tasks
Environmental Needs
Responsibilities
Schedule
Approvals
To Conclude
The role of a test plan is to guide all testing activities. It defines what is to be tested
and what is to be left out, how the testing is to be performed (described at a
general level), and by whom.
Test Cases & Test Reports
Contents
What is a test case?
Why are test cases written?
A set of test inputs, execution conditions, and expected results developed for a
particular objective, such as to exercise a particular program path or to verify
compliance with a specific requirement.
Documentation specifying inputs, predicted results, and a set of execution conditions
for a test item.
The main goal of designing any test case is to find bugs in the software under test.
Thus, it is important that the tester designs test cases that have the highest likelihood
of finding the most errors with a minimum amount of time and effort.
Note that the process of developing test cases can help find problems in the
requirements or design of an application, since it requires completely thinking through
the operation of the application. For this reason, it's useful to prepare test cases early
in the development cycle if possible.
Test Case ID
Procedure
Expected Result
Preconditions
Requirement #, etc.
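The fields above can be modeled as a simple record; the field names and sample values below are illustrative, not a prescribed template:

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """One test case record; field names mirror the list above."""
    test_case_id: str
    procedure: list              # ordered execution steps
    expected_result: str
    pre_conditions: list = field(default_factory=list)
    requirement: str = ""        # requirement number the case traces to

tc = TestCase(
    test_case_id="TC-001",
    procedure=["Open the login page", "Enter valid credentials", "Click Login"],
    expected_result="User lands on the home page",
    pre_conditions=["A valid user account exists"],
    requirement="REQ-42",
)
print(tc.test_case_id, "traces to", tc.requirement)
```

Keeping the requirement number on every case is what makes the test cases traceable, one of the qualities listed below.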
Accurate
Reviewable
Traceable
Reusable
Maintainable
Test report
Release metrics
Test results
The test result document records the outcome of each test case execution. Its
contents include:
Case Summary
Test case ID
Test Type
Expected Result
Actual Result
Bug ID
Notes/Comments
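The result fields above lend themselves to a simple pass/fail summary; the data below is made up for illustration:

```python
# Hypothetical execution results, one entry per executed test case.
results = [
    {"test_case_id": "TC-001", "expected": "home page", "actual": "home page"},
    {"test_case_id": "TC-002", "expected": "error shown", "actual": "no error",
     "bug_id": "BUG-17"},  # failed cases carry the Bug ID linking to the defect
    {"test_case_id": "TC-003", "expected": "logout ok", "actual": "logout ok"},
]

# Derive the status of each case by comparing actual against expected.
for r in results:
    r["status"] = "Pass" if r["expected"] == r["actual"] else "Fail"

passed = sum(r["status"] == "Pass" for r in results)
print(f"{passed}/{len(results)} passed ({100 * passed / len(results):.0f}%)")
```

The pass rate computed this way is one of the release metrics mentioned earlier.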
Communication & Documentation
Contents
Communication through E-mail
MS Word
MS Excel
MS Power Point
MS Paint
MS Word
Click Bookmark from the Insert menu to get the Bookmark window.
Enter a name for the bookmark and click the Add button.
Click Hyperlink from the Insert menu to get the Insert Hyperlink
window.
Select Place in This Document to get a list of all of the bookmarks in
the current document.
Select a bookmark and click OK in the Insert Hyperlink window.
To insert a table using the Toolbar, click the Insert Table icon on the
toolbar.
MS Word
Click OK.
To insert a picture
MS Excel
Click the AutoSum icon on the Toolbar. The result of the summation
appears in the cell adjacent to the selection.
To insert a comment
To create formulas
Select the formula you want, and follow the on-screen instructions.
MS Excel
To insert a chart
Select whether you want to put the chart in the current worksheet or in a
new worksheet.
MS Power Point
After you open Microsoft PowerPoint, a screen pops up asking if you would
like to create a New Presentation or Open An Existing Presentation.
AutoContent Wizard creates a new presentation by prompting you for
information about content, purpose, style, handouts, and output. The new
presentation contains sample text that you can replace with your own
information. Follow the directions and prompts that are given by Microsoft
PowerPoint.
Design Template creates a new presentation based on one of the
PowerPoint design templates supplied by Microsoft. Use what is already
supplied by Microsoft PowerPoint and change the information to your own.
MS Paint
Open Paint.
From the Edit menu, click Paste. The image of the screen or the active
window gets pasted on the new work area.
Select the entire image or a portion of the image with the help of the
Select tool.
Click Copy or Cut from the Edit menu.
Click Paste in the document where you want the entire image or the
portion of the image to appear.
You can highlight an error or a button with the help of Rectangle or Ellipse
tool.
It is a revision control tool for managing multiple revisions of the same unit
of information.
It is most commonly used in software development to manage the ongoing
development of digital documents, such as application source code and other
critical information that may be worked on by a team.
A user checks out a file, modifies it, and finally checks it in.
Only one user can check out a file at a time.
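The check-out/check-in cycle above can be sketched as a toy model (this mimics no specific tool's API):

```python
class Repository:
    """Toy model of exclusive check-out / check-in revision control."""

    def __init__(self):
        self.files = {}        # filename -> latest checked-in content
        self.checked_out = {}  # filename -> user currently holding it

    def check_out(self, filename, user):
        if filename in self.checked_out:
            raise RuntimeError(f"{filename} is already checked out")
        self.checked_out[filename] = user

    def check_in(self, filename, user, new_content):
        if self.checked_out.get(filename) != user:
            raise RuntimeError(f"{user} did not check out {filename}")
        self.files[filename] = new_content   # record the new revision
        del self.checked_out[filename]       # release the exclusive hold

repo = Repository()
repo.check_out("main.c", "alice")
# repo.check_out("main.c", "bob")  # would raise: one user at a time
repo.check_in("main.c", "alice", "int main(void) { return 0; }")
print(repo.files["main.c"])
```

This lock-based model is the one described above; many modern tools instead allow concurrent edits and merge the changes.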
Bugs and Bug Reporting
Contents
Bug
What is a bug?
Cost of bug
Bug reporting
Bug Report
Summary
BUG
The term was first used after a moth flew into an early computer and caused a
system failure.
The first computer bug was a moth found trapped between the points at Relay #70,
Panel F, of the Mark II Aiken Relay Calculator (an electromechanical relay computer)
while it was being tested at Harvard University on 9 September 1947. The
operators affixed the moth to the computer log, with the entry: "First
actual case of bug being found". They put out the word that they had
"debugged" the machine, thus introducing the term "debugging a computer
program".
What is a Bug?
A bug can be an unexpected item, a deviation from the expected result, a design that
does not match the requirements, incorrect text, an incorrect value, a fault in data
validation, a missing line, etc.
Bugs are the outcome of software testing, which is the main means of ensuring
product stability
[Bar chart: the cost of fixing a bug rises steeply across the SDLC phases — Requirement, Design, Coding, Testing, Maintenance — from roughly $1 when caught at the Requirement stage to around $100 once the product is in Maintenance.]
High-priority bugs should be addressed first and fixed before the
product/application is released
Need more info: the developer needs information to reproduce the bug
or to fix the bug
Reopened: the bug still exists even after being fixed by the developer
Bug reporting
The aim of a bug report is to enable the programmer to see the program
failing in front of them, by giving them careful and detailed instructions on
how to make it fail. If they can make it fail, they will try to gather extra
information until they know the cause. If they can't make it fail, they will
have to ask you to gather that information for them.
Bug Report
To write a fully effective report you must:
- Explain how to reproduce the problem
- Analyze the error so you can describe it in a minimum number of steps
It helps in keeping track of the issues and in stabilizing the system
Steps to reproduce
Reproducibility frequency
Severity
Environment details
Screenshots
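The fields above can be assembled into a simple plain-text report; the field names and values below are illustrative:

```python
# Field names mirror the list above; all values are made up for illustration.
bug = {
    "summary": "Login button unresponsive on second click",
    "steps_to_reproduce": ["Open login page", "Click Login twice quickly"],
    "reproducibility": "4 out of 5 attempts",
    "severity": "Major",
    "environment": "Windows 10, Chrome 120",
    "screenshots": ["login_freeze.png"],
}

# Render the record as the kind of report a developer would read.
report = "\n".join([
    f"Summary     : {bug['summary']}",
    "Steps       : " + "; ".join(bug["steps_to_reproduce"]),
    f"Repro rate  : {bug['reproducibility']}",
    f"Severity    : {bug['severity']}",
    f"Environment : {bug['environment']}",
    "Attachments : " + ", ".join(bug["screenshots"]),
])
print(report)
```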
Summarize: relate the test to customers (put a short tag line on each report)
Product Owners / Managers / Leads may change the priority using their
judgment.
Summary
Write clearly. Say what you mean, and make sure it can't be misinterpreted.
A bug tracking (or incident tracking) tool facilitates the recording and status
tracking of incidents (issues/bugs) found during testing.
Such tools often have workflow-oriented facilities to track and control the
allocation, correction and re-testing of incidents, and they provide reporting
facilities.
Some bug tracking tools:
Bugzilla
QTrack
Teamtrack
Test Director
FogBugz