
Site version 2.0/2.1

Test Plan

Version 0.95

March 13, 2006


Revisions
Version Primary Author(s) Description of Version Date Completed
0.5 Roumen Banov First draft of the Site 2.0 test plan 01/30/2005
0.6 Mitko Mitev First review 01/31/2005
0.7 Roumen Banov Second draft 02/13/2005
0.8 Mitko Mitev Second review 02/13/2005
0.9 Roumen Banov Third draft (includes Functional testing section) 02/14/2005
0.91 Vladimir Alexiev Improved formatting, reworked section Testing Resources 02/14/2005
0.92 Roumen Banov Final review (includes updated sections 4.7 & 4.8) 02/14/2005
0.93 Roumen Banov Update after the first QA session with Client team 02/23/2005
0.94 Roumen Banov Update after the second QA session with Client team 03/11/2005
0.95 Roumen Banov Update after Client team clarifications 03/13/2005

Document Approval
If appropriate, this section contains formal sign-off for both review and approval of the test plan.
Position Name Signature Date
Author Roumen Banov
Quality Manager Svetla Vyrbanova

Contents
1.1 GOALS
1.2 ASSUMPTIONS
1.3 RISKS AND ASSETS
1.4 REFERENCES
4.1 VOLUME TESTING
4.1.1 Objectives
4.1.2 Processing
4.1.3 Completion Test Criteria
4.1.4 Test Reports
4.2 CROSS BROWSER TESTING
4.2.1 Objectives
4.2.2 Processing
4.2.3 Completion Test Criteria
4.2.4 Test Reports
4.3 SECURITY TESTING
4.3.1 Objectives
4.3.2 Processing
4.3.3 Completion Test Criteria
4.3.4 Test Reports
4.4 INSTALLATION TESTING/DOCUMENTATION
4.4.1 Objectives
4.4.2 Processing
4.4.3 Completion Test Criteria
4.4.4 Test Reports
4.5 USER ACCESS TESTING
4.5.1 Objectives
4.5.2 Processing
4.5.3 Completion Test Criteria
4.5.4 Test Reports
4.6 END TO END INTEGRATION TESTING
4.6.1 Objectives
4.6.2 Processing
4.6.3 Completion Test Criteria
4.6.4 Test Reports
4.7 SPECIFICATION ADHERENCE
4.8 FUNCTIONAL TESTING
4.8.1 Objectives
4.8.2 Scope
4.8.3 Processing
4.8.4 Completion Test Criteria
4.8.5 Test Reports
5.1 STAFFING
5.2 SCHEDULE
5.3 RESOURCES

1 Introduction
This document summarizes all testing activities related to the product Site 2.0/2.1.

1.1 Goals
The goal of this test plan is to ensure optimal coverage and validation of all functional as well as non-functional requirements described in the updated document Site 2.0 Site specification.

1.2 Assumptions
A1: This test plan is based on the updated document Site 2.0 Site specification, ver. 1.92. Any discrepancies between this specification and the actual software registered during the testing phase will be considered software defects.
A2: Requirements missing from the specification will be discussed with the Client team and, if necessary, added before this Test Plan is completed. Requirements not included in the updated specification by then will be considered out of scope for this Test Plan. The final iteration (update by the QA team and review by the Client team) should be done after the Site product specification is frozen.

1.3 Risks and Assets


The Client team has set a specific requirement for the testing methodology: all testing activities (test case design, execution and reporting) should strictly follow the development of the separate parts or components of the Site product (for example, the administration part, the user management component, etc.).
With this methodology there is a risk of a QA resource shortage: the work of the QA engineer may be less efficient, because he must switch daily between test execution and reporting for the existing (fully implemented) parts or components and test case design for the new (not fully implemented) parts or components.

1.4 References
The Test Plan derives its content from the updated document Site 2.0 Site specification, ver. 1.92.

2 Features to Be Tested
The following features and functions of the product Site 2.0, described in the updated document Site Version 2.0 Specification, should be tested:
Administration (user management, site management, global reports, encoder management);
Communications management;
Communications Assets management (player, streams, slides, pages, etc.);
Actors permissions (global, manager, editor, author, viewer, presenter);
Communications Assets management (images);
Basic Reporting (communication level, communication assets level);

3 Features Not To Be Tested
The following features and functions of the product Site 2.0, described in the updated document Site Version 2.0 Specification, should not be tested:
Extended Reporting (communication level, communication asset level, global, audit) - it will be included in the next release;

4 Approach
Software testing often occurs in two phases during the development of an item: a development testing phase and a system testing phase. System testing occurs after the item has been released by the developer from development testing. These two terms are used with these meanings throughout this document.
The types of testing required for the product Site version 2.0, as identified at the moment, are:
Volume Testing
Cross Browser Testing;
Security Testing;
Installation Testing/Documentation;
User access Testing;
End to End integration testing;
Specification adherence;
Functional testing;
All these types of testing are prioritized according to the Client team requirements.
All requirements from the document V2 QA Testing Outline are included in the appropriate sections of this document.

4.1 Volume Testing


4.1.1 Objectives
The purpose of this task is to verify the correctness of the implementation of the client requirements (maximal limitations) for performance (load/volume) testing in the product Site version 2.0/2.1.
4.1.2 Processing
The process for the volume testing of the product Site version 2.0/2.1 includes the following subprocesses:
Technology research into the possible volume capabilities under the current product server limitations;
Technology research for appropriate automated performance testing tools;
Volume testing execution; it should deliver results for:
o Maximum number of concurrent live audience viewers (per product server);
o Maximum number of concurrent on-demand audience viewers (per product server);
o Maximum number of active encoders the system can support (per product server);
o PowerPoint size limitations;
o Simultaneous Slide uploads;

Initially in this volume testing the QA team will watch the gross parameters closely (response time, transactions, time to first byte (TTFB), time to last byte (TTLB), etc.). The QA team may prepare appropriate metrics for the volume testing of the Site product server (memory usage %, CPU usage %, etc.).
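As an illustration only, the sketch below shows one way the TTFB/TTLB figures could be collected with the Python standard library; the URL and the number of concurrent viewers are assumptions for the example, and in practice a dedicated load testing tool selected during the technology research would replace such a script.

```python
# Minimal sketch (assumed URL and viewer count): measure TTFB/TTLB for a batch of
# concurrent requests using only the Python standard library.
import threading
import time
import urllib.request

URL = "http://test-server.example/player/lobby"   # hypothetical product test server page
CONCURRENT_VIEWERS = 50                            # assumed load level for the example

def measure(results, index):
    start = time.monotonic()
    try:
        with urllib.request.urlopen(URL, timeout=30) as response:
            response.read(1)                       # first byte received
            ttfb = time.monotonic() - start
            response.read()                        # remaining body
            ttlb = time.monotonic() - start
        results[index] = (ttfb, ttlb)
    except OSError:
        results[index] = None                      # failed request; reported separately in a real run

results = [None] * CONCURRENT_VIEWERS
threads = [threading.Thread(target=measure, args=(results, i)) for i in range(CONCURRENT_VIEWERS)]
for thread in threads:
    thread.start()
for thread in threads:
    thread.join()

samples = [sample for sample in results if sample is not None]
if samples:
    print("avg TTFB: %.3f s, avg TTLB: %.3f s" % (
        sum(ttfb for ttfb, _ in samples) / len(samples),
        sum(ttlb for _, ttlb in samples) / len(samples)))
```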
The testing environment should include the product test server with the required software (correct version of mySQL, .NET and C++ components, etc.) and the client workstation with the required software (operating system, browser, players, etc.).
All the system testing tasks (described above) should be performed by the QA team. All test results should be reviewed by the company QM.
4.1.3 Completion Test Criteria
The completion test criteria should be a successful execution of the volume testing, including the required information about the maximal limitations of the product Site version 2.0/2.1 Server.
4.1.4 Test Reports
The reports that should be generated by the volume testing are:
Summary test report (maximal server limitations and other parameters of the product server);
Results describing the volume capabilities under the current system limitations (according to the current server specification, etc.);

4.2 Cross Browser testing


4.2.1 Objectives
The purpose of this type of testing is to verify that the product Site version 2.0 works correctly with the different types/versions of the supported browsers and with the required versions of MS Media Player, Real Player, etc.
4.2.2 Processing
The process for the cross browser testing of the product Site version 2.0/2.1 includes the following
subprocesses:
Set up the appropriate testing environment (client workstations with different types/versions of the used browsers and the required players);
Cross browser testing with the following versions/types:
o Microsoft IE 5.5 and above;
o Mozilla Firefox 1.0;
o Minimal resolution should be 800 x 600;
o Operating systems should include MS NT, 2000, XP + SP2 (required), Linux;
The testing environment should include the product test server with the required software (correct version of mySQL, .NET and C++ components, etc.) and a client workstation with the required software (browsers MS IE 5.5 and above, Mozilla/Firefox 1.0, MS Media Player, Real Player, etc.).
The product Site version 2.0 should be tested for the following compatibility:

Browser                 | IE 5.5+ | Mozilla/Firefox 1.0
Microsoft Media Player  | 6.4+    | 6.4+
Real                    | v8+     | v8+
JavaScript              | yes     | yes
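For illustration, the compatibility matrix above can be turned into test data so that every browser/player/operating system combination gets its own normal-flow run; the data below simply mirrors the matrix and section 4.2.2, while the script structure itself is an assumption, not part of the agreed tooling.

```python
# Illustrative sketch: enumerate the browser/player/OS combinations from the matrix above.
import itertools

browsers = ["IE 5.5+", "Mozilla/Firefox 1.0"]
players = ["Microsoft Media Player 6.4+", "Real v8+"]
operating_systems = ["MS NT", "2000", "XP + SP2", "Linux"]  # from section 4.2.2

for browser, player, os_name in itertools.product(browsers, players, operating_systems):
    if browser.startswith("IE") and os_name == "Linux":
        continue  # IE combinations are not applicable on Linux
    print("Run normal-flow suite: %s / %s / %s at 800 x 600" % (os_name, browser, player))
```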
Initially the QA team will focus on the application.
Later (when the BT team have adapted them) the QA team should also focus on the player and lobby pages.
All the system testing tasks (described above) should be performed by the QA team. All test results should be reviewed by the company QM.
4.2.3 Completion Test Criteria
The completion test criteria should be a successful execution of the normal flow in the product Site version 2.0/2.1 for all required types/versions of the used browsers with the appropriate players.
4.2.4 Test Reports
The reports that should be generated by the cross browser testing are:
Summary test report (for the MS browsers);
Summary test report (for Mozilla browser);
Summary test report (for Linux operating system);

4.3 Security testing


4.3.1 Objectives
The purpose of the security testing is to verify the correctness of the implementation of the site and user management components, the correctness and consistency of the global information, the correctness of the implementation of product injections where necessary, etc.
4.3.2 Processing
The product Site 2.0/2.1 will use the HTTP protocol.
The security testing should include the following testing tasks:
Checking the correctness of the implementation of the site and user management;
Checking for possible crossing of actions between different resellers; it is required that the actions of different resellers do not cross (a minimal check is sketched after this list);
Checking the correctness and consistency of the global information;
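A minimal sketch of the reseller crossing check, assuming hypothetical resource URLs and a session cookie obtained by logging in as reseller A; the real product URLs and authentication mechanism are not specified here, so this only illustrates the expected pass/fail logic.

```python
# Hedged sketch (hypothetical URLs and session cookie): verify that a session
# belonging to reseller A cannot reach resources owned by reseller B over HTTP.
import urllib.error
import urllib.request

RESELLER_A_COOKIE = "session=reseller-a-session-id"    # assumed value, obtained after login as reseller A
RESELLER_B_RESOURCES = [
    "http://test-server.example/admin/site?id=200",    # hypothetical site owned by reseller B
    "http://test-server.example/admin/user?id=3001",   # hypothetical user owned by reseller B
]

for url in RESELLER_B_RESOURCES:
    request = urllib.request.Request(url, headers={"Cookie": RESELLER_A_COOKIE})
    try:
        with urllib.request.urlopen(request) as response:
            print("FAIL: %s returned %d to reseller A" % (url, response.status))
    except urllib.error.HTTPError as error:
        # An access-denied style response (403/404) is the expected, non-crossing behaviour.
        print("PASS: %s denied with %d" % (url, error.code))
    except urllib.error.URLError as error:
        print("ERROR: %s unreachable (%s)" % (url, error.reason))
```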
All the system testing tasks (described above) should be performed by the QA team. All test results should be reviewed by the company QM.
4.3.3 Completion Test Criteria
The completion test criteria should be a complete execution (without any major issues) of all required testing tasks (described above) in the product Site version 2.0/2.1.
4.3.4 Test Reports
The reports that should be generated by the security testing are:
Summary test report (for the executed testing tasks and checked components);

4.4 Installation Testing/Documentation
4.4.1 Objectives
The purpose of this task is to verify the completeness and correctness of the installation procedure and product documentation for the product Site version 2.0/2.1, and to verify that the content of the technical documentation is clear, understandable and corresponds to the client requirements. The main purpose of this task is to ensure efficient knowledge transfer (in product details) from the development team to the Client and support teams.
4.4.2 Processing
The process for the installation testing of the product Site version 2.0/2.1 includes the following
subprocesses:
Familiarize with the described installation procedure;
Check the prerequisites for the installation process (for example, required service packs for
the operating system) in the testing environment;
Checking Encoder status in the testing environment;
BSlider;
Run the installation procedure step-by-step;
Report all identified issues/bugs during the installation testing;
Verify the fixed issues/bugs in the installation procedure (re-run the updated procedure);
The Client team will document the production environment that should be simulated for testing purposes.
The process for the documentation testing of the product Site version 2.0/2.1 includes the following
subprocesses:
Collect all available technical documents from the project repository;
Verify the content of all technical documents against the real implementation in the product
Site version 2.0 (DB tables, stored procedures, views, user interface, pages, menus, etc.);
Report all identified issues during the testing of technical documentation;
Verify the fixed issues in the technical documentation (review of the updated technical
documentation);
All the system testing tasks (described above) should be performed by the QA team. All test results should be reviewed by the company QM.
4.4.3 Completion Test Criteria
The completion test criteria for the installation testing should be a successful execution of the installation procedure, without any issues/bugs during the installation process of the product Site version 2.0/2.1.
The completion test criteria for the documentation should be a full conformity (without major issues and notes) of the technical documentation of the product Site version 2.0/2.1 with the client requirements, the implemented solutions and the standards used during the project lifecycle.
All issues identified during the installation/documentation testing should be fixed by the developers and verified by the QA team (by re-running the updated installation procedure).
4.4.4 Test Reports
The Test report for the installation testing should be a checklist with all installation procedure steps with the
name of the QA engineer, step status (passed/failed), comments and notes.
The Test report for the documentation should be a checklist with all technical documents (name, short
description, document status, name of the QA engineer, comments and notes).
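The two checklists described in this section could be kept in a simple structured form; the field names below follow the report descriptions above, but the concrete representation and the sample rows are only assumptions for illustration.

```python
# Sketch of a checklist row matching the installation/documentation test reports
# described above (the representation and the sample data are assumptions).
from dataclasses import dataclass

@dataclass
class ChecklistRow:
    item: str            # installation procedure step or technical document name
    qa_engineer: str
    status: str          # "passed" / "failed"
    comments: str = ""

checklist = [
    ChecklistRow("Check operating system service pack prerequisites", "Petia Bojanova", "passed"),
    ChecklistRow("Run installation procedure step-by-step", "Petia Bojanova", "failed",
                 "Encoder status check fails in the testing environment"),
]

failed = [row for row in checklist if row.status == "failed"]
print("%d of %d checklist items failed" % (len(failed), len(checklist)))
```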

4.5 User Access Testing


4.5.1 Objectives
The purpose of this task is to verify the correctness of the implementation of the client requirements about the user access permissions in the product Site version 2.0/2.1.
4.5.2 Processing
The process for the user access testing of the product Site version 2.0/2.1 includes the following
subprocesses:
Familiarize with the user access rights requirements;
Prepare the test scenario for all roles and combinations in the product Site version 2.0:
o Single user access:
Presenter only (only access a presenter page)
Viewer only (only view site content, no editorial)
Author only (only view, edit, present own objects)
Editor only (view, edit, present all site objects)
Manager only (user admin)
o Site level user access combinations:
Viewer and Presenter (only view site content, no editorial, and present)
Viewer and Author (view site content; edit and present own objects)
Viewer, Author and Manager (view site content; edit and present own objects; and admin)
Editor and Manager (view, edit, present all site objects, and admin)
o Global level user access combinations:
Presenter
Viewer
Author
Editor
Manager
Viewer and Presenter
Viewer and Author
Viewer and Author and Manager
Editor and Manager
o Delete user (different types and combinations);
o Transfer object ownership between different user types;
o Test the implications of transferring to an author a communication containing assets that the author does not have editorial access to (the author should go through to the underlying asset overview page with no editorial control);
The user access testing will be a part of the functional testing of the product Site 2.0/2.1.
In order to guarantee strict testing of all user access permissions over all product pages by the QA team, and easy verification by the Client team, a specific matrix (Users Access Matrix) will be prepared with the following three dimensions (a minimal sketch of this matrix follows the list):
The rows are all users in the Site product (author, editor, manager, etc.);
The columns are all access levels (single, site, global) for the actors;
The content of the matrix cells is all permitted actions for a specific actor with a definite access level (add, edit, view, delete, etc.);
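A minimal sketch of the Users Access Matrix as test data, assuming an illustrative set of cell values; the actual permitted actions per actor and access level must come from the agreed client requirements, not from this example.

```python
# Hedged sketch of the Users Access Matrix: rows are actors, columns are access
# levels, cells are permitted actions. The cell values below are illustrative
# assumptions only, not the agreed permissions.
ACCESS_MATRIX = {
    "viewer":    {"site": {"view"}},
    "presenter": {"site": {"present"}},
    "author":    {"site": {"view", "edit own", "present own"}},
    "editor":    {"site": {"view", "edit", "present"}},
    "manager":   {"site": {"user admin"}, "global": {"user admin"}},
}

def is_permitted(actor, level, action):
    """Return True if the matrix grants the action to the actor at the given access level."""
    return action in ACCESS_MATRIX.get(actor, {}).get(level, set())

# Example checks used while designing the user access test cases:
assert is_permitted("editor", "site", "edit")
assert not is_permitted("viewer", "site", "edit")
```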
The testing environment should include the product test server with the required software (correct version of mySQL, .NET and C++ components, etc.) and the client workstation with the required software (operating system, browser, etc.).
All the system testing tasks (described above) should be performed by the QA team. All test results should be reviewed by the company QM.
4.5.3 Completion Test Criteria
The completion test criteria should be a successful execution of the user access testing for the normal flow in the product Site version 2.0/2.1, according to the required user permissions for the product.
4.5.4 Test Reports
The reports that should be generated by the user access testing are:
Summary test report (Users access Matrix);
Summary test report (delete user and transfer object ownership);

4.6 End to End Integration testing


4.6.1 Objectives
The purpose of this task is to verify the correctness of the implementation of the end to end workflow in the
testing environment for product Site version 2.0/2.1. The described workflow scenarios will be used as
training materials for the customer.
4.6.2 Processing
The process for the end to end integration testing of the product Site version 2.0/2.1 includes the following
subprocesses:
Prepare the appropriate workflow scenarios for the initial end to end testing:
o Streams On demand:
Create Site
Create Users
Create WMP Stream Asset (multi-bandwidths)
Create Real Stream Asset (multi-bandwidths)
Create Player Asset
Create Slide Asset
Create Communication
Ensure during Pending players are not viewable by outside world
Ensure during Published players are viewable by outside world
Ensure during Pending players are not viewable by outside world
Delete Communication
Delete Assets
Delete User
Delete Site

o Streams Live to On demand:
Create Site
Create Users
Configure encoder
Create WMP Stream Asset (multi-bandwidths) (Schedule Encoder)
Create Real Stream Asset (multi-bandwidths) (Schedule Encoder)
Create Player Asset
Create Slide Asset
Create Communication
Attach Input device
Monitor Encoders
Ensure during Pending players are not viewable by outside world
Ensure during Published players are viewable by outside world
Use Presenter Screen to control slides
View event as audience members (view simultaneous slides) via different formats and bandwidths
Convert to on-demand (using saved file sizes)
Ensure during Pending players are not viewable by outside world
Ensure during Published players are viewable by outside world
Ensure during Pending players are not viewable by outside world
Delete Communication
Delete Assets
Remove encoder
Delete User
Delete Site
Workflow scenario execution (a minimal runner sketch is shown after this list):
o On demand streams;
o Live to on demand streams;
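As a sketch only, the on-demand scenario above could be driven by a small runner that executes the listed steps in order and stops the end-to-end run at the first failure; the step execution itself is a placeholder (assumption), and only the step ordering mirrors the scenario.

```python
# Sketch of a scenario runner for the on-demand workflow above; execute_step is a
# placeholder (assumption), and only the step ordering mirrors the listed scenario.
ON_DEMAND_STEPS = [
    "Create Site",
    "Create Users",
    "Create WMP Stream Asset (multi-bandwidths)",
    "Create Real Stream Asset (multi-bandwidths)",
    "Create Player Asset",
    "Create Slide Asset",
    "Create Communication",
    "Ensure pending players are not viewable by the outside world",
    "Ensure published players are viewable by the outside world",
    "Delete Communication",
    "Delete Assets",
    "Delete User",
    "Delete Site",
]

def run_scenario(steps, execute_step):
    """Execute each step in order and stop the end-to-end run at the first failure."""
    for number, step in enumerate(steps, start=1):
        if not execute_step(step):
            print("Step %d failed: %s" % (number, step))
            return False
    print("Scenario completed: %d steps passed" % len(steps))
    return True

# Dry run where every step is reported as passed (a real run would drive the product UI/API).
run_scenario(ON_DEMAND_STEPS, lambda step: True)
```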
The testing environment should include the product test server with the required software (correct version of mySQL, .NET and C++ components, etc.) and the client workstation with the required software (operating system, browser, players, etc.).
All the system testing tasks (described above) should be performed by the QA team. All test results should be reviewed by the company QM.
4.6.3 Completion Test Criteria
The completion test criteria should be a successful execution of the end to end integration testing for the normal flow in the product Site version 2.0/2.1 for the different types of streams (on demand and live).
4.6.4 Test Reports
The reports that should be generated by the end to end integration testing are:
Summary test report (including on demand workflow);
Summary test report (including live to on demand workflow);

4.7 Specification adherence
The main purpose of this task is ongoing QA: to ensure that each new release adheres to the document Site specification 1.93 (up to the point intended by the developers).

4.8 Functional Testing


For the product Site version 2.0/2.1 the following types of testing are planned:
Development testing (test case suite execution);
System testing (test scenario execution);

4.8.1 Objectives
The main objective of the functional testing task is to verify the correctness of the implemented functionality in the product Site version 2.0/2.1 against the client requirements (described in the updated document Site version 2.0 Site specification) and against the industry standards used during the project lifecycle.
4.8.2 Scope
All identified main parts of the product Site version 2.0 will be tested:
Manage Communications;
Manage Communications Assets;
Basic Reporting;
Administration;
The set of designed test cases will cover the whole normal flow functionality (with valid input data) plus some typical exceptions (for example, an incorrect password during the login process, etc.). Every test case has a unique ID and describes a simple action (event) over the product objects (player, stream, user, etc.) on a specific product page, for example, search communication, create user, etc. All test cases related to a specific component (for example, site management) should be grouped into one test scenario that includes all possible actions (test cases) for the scenario object (for example, the site management scenario should include the test cases add new, edit existing, view, delete, etc.).
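The test case and test scenario structure described above could be represented as follows; the ID scheme, field names and sample pages are assumptions used only to illustrate the grouping of test cases into scenarios.

```python
# Sketch of the test case / test scenario grouping described above; the ID scheme
# and field names are assumptions used only for illustration.
from dataclasses import dataclass

@dataclass
class TestCase:
    case_id: str   # unique ID, e.g. "SM-001" (assumed numbering scheme)
    action: str    # one simple action over a product object on a specific page
    page: str

site_management_scenario = {
    "name": "Site management",
    "cases": [
        TestCase("SM-001", "Add new site", "Administration / Site management"),
        TestCase("SM-002", "Edit existing site", "Administration / Site management"),
        TestCase("SM-003", "View site", "Administration / Site management"),
        TestCase("SM-004", "Delete site", "Administration / Site management"),
    ],
}

print("%s scenario: %d test cases" % (site_management_scenario["name"],
                                      len(site_management_scenario["cases"])))
```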
4.8.3 Processing
The functional testing of the product Site 2.0/2.1 includes the following subprocesses:
Test plan iterations (preparation and review; update and review);
Test cases design and review;
Test cases update (synchronization with the updated specification Site 2.0 1.93);
Development testing - test scenario/case execution and test result reporting;
Development testing - verification of fixed bugs/issues;
System testing - separate workflow execution for all users and test result reporting;
System testing - verification of fixed bugs/issues;
The testing environment should include the product test server with the required software (correct version of mySQL, .NET and C++ components, etc.) and the client workstation with the required software (browsers MS IE 5.5, Mozilla/Firefox 1.0, players, etc.).

All development testing tasks should be performed by the QA team; the system testing tasks should be performed by the QA team under the daily monitoring of the QM. All test results should be reviewed by the company QM.
4.8.4 Completion Test Criteria
The completion test criteria should be a checklist with a suite of all test scenarios (including all designed test cases and all possible users of the product); this checklist should be completed (step-by-step) by the QA team. The successful completion of the functional testing should be a trigger for starting the deployment process of the Site product.
4.8.5 Test Reports
The reports for the functional testing that should be generated by the product testing process are:
Bugs/Issues summary report - a full list of all bugs/issues identified during the development/system testing, their severity/priority and their current status (identified, in progress, fixed, verified, etc.);
Summary test reports - total number of test cases/scenarios; % of test cases executed; % of test cases passed; comments and notes for the failed test cases, etc.;
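A minimal sketch of how the summary test report figures named above could be computed; the counts are invented example values, and whether "% passed" is taken against executed or total test cases should follow the Client team's reporting convention.

```python
# Minimal sketch of the summary test report figures (the counts are example values).
total_cases = 120
executed = 110
passed = 104

print("Total test cases: %d" % total_cases)
print("Test cases executed: %.1f%%" % (100.0 * executed / total_cases))
print("Test cases passed (of executed): %.1f%%" % (100.0 * passed / executed))
```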

5 Testing Resources, Schedule
This section describes the significant resources of the testing effort. It covers items such as the roles involved in testing on the project, the responsibilities of each, staffing information, etc.

5.1 Staffing
QA staffing of the project, including roles and responsibilities of individual QA roles:
Name: Roumen Banov
Role: QA Lead
Involvement: part-time
Responsibility: Prepare the test plan; provide direction for and review test cases; participate in the meetings with the BT team; report identified bugs/issues; control test execution and reporting
Skills: Quality Manager with 2+ years of experience in project and team management and 7+ years of experience in commercial projects as QA Lead and QA Engineer

Name: Petia Bojanova
Role: QA Engineer
Involvement: full-time
Responsibility: Design and execute test cases; report identified bugs/issues; verify fixed bugs; check the installation procedure and technical documentation; responsible for the volume testing
Skills: QA Engineer with 4+ years of experience in commercial projects in different business areas

Name: Valentin Chernev
Role: QA Engineer
Involvement: part-time
Responsibility: Design and execute test cases; report identified bugs/issues; verify fixed bugs
Skills: QA Engineer trained in the course Basics for Software Testing - Foundation level, with a good understanding of QA methodology

Name: Svetla Vyrbanova
Role: QM
Involvement: part-time
Responsibility: Review all test documents (test plan, test cases, test reports, etc.) and control the daily activities of the QA team
Skills: Company Quality Manager with long experience in commercial projects and QA team management
Note: all QA staff will be charged at the same hourly rate, regardless of qualifications.

5.2 Schedule
The test milestones of the project depend on the project timeline; these milestones should be synchronized with the main development tasks. The prioritization of the QA activities required by the Client team is a focus on the non-functional testing (section 4 in this document).
The new testing methodology (QA activities should strictly follow the progress of the development tasks) requires strong and precise coordination between the efforts of the development and QA teams and perfect synchronization between the development and QA milestones and deliverables.

5.3 Resources
Resources allocated to the test effort (e.g. equipment, lab, software, etc.).
The following resources should be listed with their name and version:
Computer (processor, memory, hard disk, peripheries, etc.)
Operating system
Environment to develop functional tests
Environment to develop unit tests
Environment to contain documents, program and test code
Bug tracking system used
Schedule development environment

Note: If the environment and tools used in the testing process are described as part of the project plan, there is no need to describe them here.

