
XXXX

Test Plan For Phase 1

DD-MM-YY / Version

Revision History

Version   Date   Author(s)   Reviewer(s)   Change Description

Copyright Information
This document is the exclusive property of XXX Corporation ("XXX"); the recipient agrees that he/she may not copy,
transmit, use, or disclose the confidential and proprietary information set forth herein by any means without the
express written consent of XXX. By accepting a copy hereof, the recipient agrees to adhere to and be bound by
these conditions regarding the confidentiality of XXX's practices and procedures, and to use these documents solely
for responding to XXX's operations methodology. All rights reserved, XXX Corporation, 2000. XXX IT reserves the
right to revisit the Business Requirements and Functional Specifications documents if approval to proceed is not
received within 90 days of the issue date.
Contents

1 Introduction
2 Test Plan Approval Process
3 Associated & Referenced Documentation
  3.1 Framework, Elements, Events, & User Flows
  3.2 Testing Project Plans
  3.3 Test Plans and Test Scripts
4 Testing Strategy
  4.1 Scope
  4.2 Testing Approach
  4.3 Test Setup
  4.4 Test Environment Details
  4.5 Responsibilities
  4.6 Risks that impact testing
  4.7 Test Suspension/Resumption Criteria
  4.8 Test Stop Criteria
5 Test Case Design and Development
  5.1 Test Case Design Instructions
  5.2 Test Case Design Deliverables
6 Testing Team
  6.1 Core Team
  6.2 Technical Support Team
  6.3 Assumptions, Constraints, and Exclusions
7 Testing Tools
  7.1 Testing Tools
  7.2 Assumptions, Constraints, and Exclusions
8 Key External Dependencies
  8.1 Assumptions, Constraints, and Exclusions
9 Metrics Collection
10 Classification of Issues
  10.1 Assumptions, Constraints, and Exclusions
11 Unit Testing
  11.1 Purpose
  11.2 Responsibility
  11.3 Environment
  11.4 Exit Criteria
  11.5 Assumptions, Constraints, and Exclusions
12 Integration Testing
  12.1 Purpose
  12.2 Responsibility
  12.3 Environment
  12.4 Test Data
    12.4.1 Mainframe test data
    12.4.2 Local test data
  12.5 Test Execution Process
  12.6 Exit Criteria
  12.7 Additional Information
  12.8 Assumptions, Constraints, and Exclusions
13 System (QA) Testing
  13.1 Purpose
  13.2 Responsibility
  13.3 Environment
  13.4 Test Data
    13.4.1 Mainframe test data
    13.4.2 Local test data
  13.5 Test Execution Process
  13.6 Exit Criteria
  13.7 Additional Information
  13.8 Assumptions, Constraints, and Exclusions
14 Mainframe Testing
  14.1 Purpose
  14.2 Responsibility
  14.3 Environment
  14.4 Test Execution Process
15 Load Testing
  15.1 Purpose
  15.2 Scope
  15.3 Responsibility
  15.4 Environment
  15.5 Testing Methodology
    15.5.1 Load testing
    15.5.2 Endurance testing
    15.5.3 Planned testing cycles
  15.6 Metrics to be measured
  15.7 Test Deliverables
  15.8 Assumptions, Limitations and Constraints
  15.9 Exit Criteria
16 Regression Testing
  16.1 Purpose
  16.2 Responsibility
  16.3 Environment
  16.4 Exit Criteria
17 User Acceptance Testing
  17.1 Purpose
  17.2 Responsibility
  17.3 Environment
  17.4 Test Data
  17.5 Test Execution Process
  17.6 Exit Criteria
  17.7 Additional Information
  17.8 Assumptions, Constraints, and Exclusions
18 Soft Launch



1 Introduction
XXX is in the process of re-engineering and re-designing the XXX.com website. XXX has been working with
XYZ Private Limited to complete the architectural and detailed design of the new XXX.com site. At this
time, the development phase of the project is underway and the site launch is planned for March 2005.

Quality Assurance Testing is the joint responsibility of XYZ and the Business team from Budget. The
purpose of this document is to provide an overview of the testing process for the XXX.com project. This
document will be distributed to all Project Managers for review. It is the responsibility of the Project
Managers to distribute this document to the appropriate team members for review where necessary.

2 Test Plan Approval Process


This document will go through a two-week review and sign-off process after its submission to the
XXX IT and XXX.com business groups. During these two weeks, reviewers can provide their comments
on the test plan document to ABC (abc@XYZ.com); please use Track Changes in the document to indicate
your feedback. One week after the submission of this document, the Test Lead will schedule a detailed
walkthrough of the test plan to discuss the review comments. The final sign-off of this test plan will
be scheduled at the end of the two-week review cycle. Any changes after the test plan sign-off must go
through the formal Change Request (CR) process. Refer to the SharePoint portal for the detailed CR
process overview and the templates for CR submission.

Please refer to the following document in SharePoint for the team structure of the eCommerce XXX.com
Redesign project:

PMO Documents/Core PMO Documents/Roles and Responsibilities.doc.

3 Associated & Referenced Documentation


3.1 Framework, Elements, Events, & User Flows

The Integration and System Testing plans and scripts are based upon the information provided in the
signed-off sections of the following documents: Booking Engine Use Cases EBR - Budget.com; Non-Booking
Engine Use Cases XXX.com Redesign EBR; Page Specifications for Ecommerce XXX.com Redesign;
Non-Functional Requirements XXX.com Redesign EBR; the External interfaces document and Data feeds;
Technical Description – Query String Parameters; and the Splash Pages list. Any changes to these documents
will follow the appropriate channels of the Change Control Board. Once changes are approved, the test
plans and test scripts will be modified accordingly.

3.2 Testing Project Plans

The detailed testing project plan is part of the master project plan.

3.3 Test Plans and Test Scripts

As test plans and test scripts are completed and assigned a version number, they will be placed in the
SharePoint portal under the Test Documents folder. As plans and scripts are completed or modified,
notification will be sent to the appropriate individuals. Scripts created for the purpose of testing will be
placed in the above-referenced Test Documents folder under Integration, QA, and UAT.


4 Testing Strategy
4.1 Scope

The following types of testing will be conducted for this project:

• Unit

• Integration

• System

• Regression

• Load

• User Acceptance

Participation from all development areas will be required. Each Development Project Plan should account
for development participation in each phase of testing. It is anticipated that the level of developer
involvement will decrease as the testing progresses.

4.2 Testing Approach

• Manual test script generation will be the preferred method until such time as the Testing
Project Manager determines that site stability is adequate for automated script creation.

• A “two pass” per iteration approach will be used.

• A-Pass: focuses on "normal" conditions to ensure all parts of the application work as expected
under the normal test scripts. Immediate identification of major issues is required.

• B-Pass: focuses on "exception" conditions to ensure boundary conditions, error handling, etc.
work correctly. Immediate identification of major issues is required.

• Four (4) types of test scripts will be created:

• Normal (N): Test scripts that test the expected behavior under normal, or “pass” conditions.

• Exception (E): Test scripts that test the expected behavior under exception, or “fail”
conditions.

• Data Normal (DN): Test scripts that test the expected behavior under data-specific normal
conditions.

• Data Exception (DE): Test scripts that test the expected behavior under data-specific
exception conditions.

• Iteration: One (1) complete end-to-end A-Pass and one (1) complete end-to-end B-Pass across all
modules.


• The number of iterations for Unit Testing will be determined by the Development Project Manager(s).

• Unit test script creation and execution is the responsibility of the development staff(s).

• The number of iterations for Integration Testing will be on an as-needed basis within the Integration
Testing Cycle. This will be determined by the Testing Project Manager and Customer Application
Project Manager(s) during the Integration Testing Cycle.

• Integration Testing will include all (N) test scripts during the A-Pass and (E) test scripts during the
B-Pass.

• System Testing will consist of up to three iterations. Should issues arise that justify additional
iterations, the testing timeline will increase by five (5) days per iteration.

• System Testing will include all (N) & (DN) test scripts during the A-Pass and (E) & (DE) test scripts
during the B-Pass.

• User Acceptance Testing: test scripts are to be created by the QA team with the help of the Business
Analyst and the User Acceptance Group (Business).

• Load Testing will consist of a select group of (N) scripts that accurately represent a cross-section
of functionality against a predetermined load.

• Regression Testing will be created from the (N), (DN), (E), & (DE) test scripts.

4.3 Test Setup

The diagram below gives a high-level overview of the proposed system.

[System overview diagram: the browser connects to the web server (iPlanet 7.1) on Host 1, which passes
requests to the application server (WebLogic 8.1) and the Personalization/MUX application on Host 2;
content is managed in TeamSite on Host 3 and deployed via Open Deploy; the application and
personalization databases (Oracle 9i) reside on Host 4 and are refreshed by a nightly feed.]

4.4 Test Environment Details

EBR Test Environment Details


Dates            Phase                  Budget.com                Mainframe

Until 11/12      Unit Testing           Local Box                 Budget Highway Test
11/15 to 11/30   Integration Testing    Integration Environment   Budget Highway ATR
11/15 to 3/30    Bug fixes              Local Box                 Budget Highway ATR
12/1 to 1/14     QA Testing             QA Environment            Budget Highway ATR
12/13 to 1/14    Testing by Business    QA Environment            Budget Highway ATR
                 (M/F transactions for
                 Rates & Reservation)
1/17 to 2/11     QA Testing             Production Environment    Budget Highway ATR
1/17 to 2/11     Limited UAT            QA Environment            Budget Highway ATR
2/14 to 3/4      Final UAT              Production Environment    Budget Highway ATR
3/7 to 3/18      Soft Launch            Production Environment    Budget Production
                 (limited testing)


4.5 Responsibilities

The areas to be tested and the next steps / important notes for each are listed below.

1. Functionality based on use cases -- booking engine, non-booking engine, BCD admin tool,
non-functional requirements: Ensure that test cases cover all requirements listed in the signed-off
documents - Booking Engine, Non-Booking Engine, and NFRs.

2. External data feeds: Steven to produce the necessary documentation covering all aspects of the data
feeds and their processing (e.g., CRON job names, the servers on which they run, their schedules, etc.).
This needs to be done for both QA and production.

3. Splash pages hosted under XXX.com for partners: Review the list of splash pages and include them in
the overall testing project plan.

4. Requests from other sites with specific URL parameters to the XXX.com website: (1) Check with the
business whether there is a master list of the external websites from which the XXX.com website is
invoked. (2) Discuss with the technical team and the business the list of URL parameters that will be
supported in the new XXX.com website.

5. Testing of the static content pages in the site: Business needs to complete this list. This activity
will start in the month of November. Review the list of static content pages prepared by business and
include it in the overall testing project plan.

6. Fast Break front end application and the admin tool: Amit to have a preliminary discussion with Hans
to understand the functionalities. Request Hans to create basic test scenarios. Amit to include a
"load test" of Fast Break in test planning.

7. Indigio-managed admin tools -- affiliate management tool, location correction admin tool: Ask the
Indigio team to come up with a test plan and test cases and include them in the overall testing project
plan. Planning to get test results and updates during the testing phase.

8. Testing of the new/modified mainframe transactions: Alfredo confirmed that the mainframe team will
perform the unit testing and QA for all mainframe transaction changes (PSR items).

9. Regression testing of all the mainframe transactions: Alfredo confirmed that the mainframe group will
perform the regression testing of all mainframe transactions that are used in Budget.com.

10. Sending emails and email campaign management - E-Dialog: Amit to pass on the relevant XXX.com test
cases to E-Dialog (e.g., Reservation Confirmation and Reservation Reconfirmation emails being sent). Get
their validation on the test cases.

11. Reporting -- basic testing: Review the tagging requirements from Indigio and also include them in
the master testing project plan.

12. Reporting -- extensive testing including analytics reports: Ask the Indigio team to come up with a
test plan and test cases and include them in the overall testing project plan. Planning to get test
results and updates during the testing phase.

13. Outage component: Need to discuss further with the technical team and also with IBM regarding the
testing. Planning to get test results and updates during the testing phase.

14. True North (mapping tool) with misspelling corrections: Ask the Indigio team to come up with a test
plan and test cases and include them in the overall testing project plan. Planning to get test results
and updates during the testing phase.

15. Production testing: Get a detailed plan from the business outlining test scenarios, test data, the
group responsible for testing, and the schedule. Include it in the overall testing project plan.

16. Perfect Drive front end application and the admin tool: TBD.

Note: The Lead System Tester (XYZZ) will be "accountable" for completion of the testing activities
listed above; however, responsibility for executing the tests rests with the individuals identified above.


4.6 Risks that impact testing

• Risk: Any of the deliverables by the development team not happening as per the project plan.
Probability: Medium. Impact: High. Contingency plan: Testing time may be lost; to compensate,
additional resources would be added appropriately.

• Risk: System failure and loss of data during the testing process. Probability: Medium. Impact: Low.
Contingency plan: A database backup strategy should be in place so that loss of data can be prevented.

• Risk: Test data not migrated in time. Probability: Medium. Impact: Low. Contingency plan: Test the
functionality not involving data feeds until migration.

• Risk: Connectivity during test execution from offshore. Probability: Low. Impact: Medium. Contingency
plan: A local test setup should be in place; except for scenarios involving mainframes, tests can be
executed locally.

4.7 Test Suspension/Resumption Criteria

A sanity test will be carried out on every build received from the development team to ensure the
suitability of the application for further testing. A set of functional test cases will be identified to
run during the sanity test. Testing will be suspended when it is not possible to proceed with test
execution due to a major showstopper error in the application.

Testing shall be resumed once the above problems are addressed.

4.8 Test Stop Criteria

All planned tests have been executed.


5 Test Case Design and Development


5.1 Test Case Design Instructions

1. Use the following documents to create test cases:

• Booking Engine Use Cases EBR - Budget.com
• Non-Booking Engine Use Cases XXX.com Redesign EBR
• Page Specifications for Ecommerce XXX.com Redesign
• Non-Functional Requirements XXX.com Redesign EBR
• External interfaces document
• Data feeds
• Technical Description – Query String Parameters
• Splash Pages list

2. Generate test cases using equivalence partitioning and boundary value analysis techniques (a worked
example follows this list).
3. Maintain traceability from each test case back to the use case document.
4. The number of steps to execute a test case should not exceed 20; longer scenarios should be split
into multiple test cases for better tracking.
5. Test case execution steps should be detailed enough that any tester can complete the test case
without prior system knowledge.
6. Where required, mention the business rules and formulas in the expected-result column for reference.
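
As an illustration of item 2, the sketch below applies boundary value analysis to a hypothetical
"rental days" field assumed to accept 1 to 30 days; the field, its range, and the validator are
assumptions for the example, not requirements taken from the use case documents. A minimal sketch in Java:

    // Boundary value analysis for a hypothetical "rental days" field assumed
    // to accept 1..30. Partitions: <1 (invalid), 1..30 (valid), >30 (invalid).
    // Boundary values exercised: 0, 1, 2, 29, 30, 31.
    public class RentalDaysBoundaryExample {

        // Stand-in for the real validation rule under test.
        static boolean isValidRentalDays(int days) {
            return days >= 1 && days <= 30;
        }

        public static void main(String[] args) {
            int[] values =       {0,     1,    2,    29,   30,   31};
            boolean[] expected = {false, true, true, true, true, false};

            for (int i = 0; i < values.length; i++) {
                boolean actual = isValidRentalDays(values[i]);
                String verdict = (actual == expected[i]) ? "PASS" : "FAIL";
                System.out.println("days=" + values[i] + " expected=" + expected[i]
                        + " actual=" + actual + " -> " + verdict);
            }
        }
    }

Each boundary pair (0/1 and 30/31) covers one partition edge, which is why six values suffice instead
of exercising all thirty valid inputs.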

5.2 Test Case Design Deliverables

Test cases will be developed by the Test team and reviewed by Business before test execution. In case of
requirement changes, refer to the Change Request process defined in the Test Plan Approval Process section.


6 Testing Team
6.1 Core Team

- Testing Project Manager:

- Lead System Tester:

- System Testers:

- User Acceptance Group Coordinator:

6.2 Technical Support Team

Please refer to SharePoint for the Technical Support Team details of the eCommerce XXX.com Redesign project:

PMO Documents/Core PMO Documents/Roles and Responsibilities.doc

6.3 Assumptions, Constraints, and Exclusions

Any reference to the Testing Team will be those individuals listed above under “Core Team.”


7 Testing Tools
7.1 Testing Tools

Manual: Test cases will be created via Excel templates.

Refer to the Templates section in SharePoint for the test case template.

Automated: Automated scripts will be created/executed using the following:

• Quick Test Pro – by Mercury

• Load Runner – by Mercury

Defect Tracker: PVCS Tracker will be used as a defect tracking tool.

The testing tool decision is pending budget approval; the tools listed above are the proposed testing
tools.

7.2 Assumptions, Constraints, and Exclusions

QTP will be used for regression testing and Load Runner for load testing.

The test scripts will be created and executed by the XYZ Testing Team offshore. The test scripts will be
shared with XXX for onsite execution in the later part of System Testing and again just prior to
implementation.

Limitations of test automation:

• Problems with tool

• Support from vendor

• Rapidly changing requirements


8 Key External Dependencies


Below is a list of all key external dependencies.

• Baseline and/or updates to the following documents:


• Booking Engine Use Cases EBR - Budget.com

• Non-Booking Engine Use Cases XXX.com Redesign EBR

• Page Specifications for Ecommerce XXX.com Redesign

• Non-Functional Requirements XXX.com Redesign EBR

• External interfaces document

• Data feeds

• Technical Description – Query String Parameters


• Splash Pages list

• Completion of Development & Unit Testing


• Completion of HTML
• Integration of HTML into development code
• Completion of Informative Pages
• Data migration for testing purposes
• Data feed documentation from Casey Miller for validating nightly feed data in the application.

8.1 Assumptions, Constraints, and Exclusions

None to state at this time.


9 Metrics collection
Detailed defect analysis shall be done for the reported defects, and test case execution status shall be
reported for each module.
The metrics to be collected during the test life cycle are:
1. Defect Location Metrics – Defects raised against each module shall be plotted on a graph to indicate
the affected module.
2. Severity Metrics – Each defect has an associated severity (Critical, High, Medium, or Low), which
reflects how much adverse impact the defect has or how important the affected functionality is. The
number of issues per severity shall be plotted on a graph; examining the severity distribution of a
project's issues helps identify problem areas.
3. Defect Closure Metrics – To indicate progress, the number of raised and closed defects over time
shall be plotted on a graph.
4. Defect Status Metrics – These will indicate the number of defects in various states, such as new,
assigned, resolved, verified, etc.
5. Re-opened bugs – The number of defects re-opened by the testing team after being fixed by the
development team shall be reported, and the percentage shall be calculated with respect to the total
number of defects logged (for example, 8 re-opened defects out of 160 logged gives a 5% re-open rate).
6. Test case progression trend – This trend shall indicate the progress of test execution by module. It
shall state the number of test cases planned, executed, passed, and failed.
These metrics shall be collected and presented as a test summary report after each test cycle, and shall
also be part of the weekly status report.
Refer to the Templates section in SharePoint for the Metrics Analysis template.
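
A minimal sketch in Java of two of the computations above (defect status counts and the re-open
percentage), assuming a simple in-memory defect list; the field names and status values are illustrative
and are not PVCS Tracker's schema:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;
    import java.util.TreeMap;

    // Illustrative computation of Defect Status Metrics and the re-open rate.
    public class DefectMetricsExample {

        static class Defect {
            final String id;
            final String status;
            final boolean reopened;
            Defect(String id, String status, boolean reopened) {
                this.id = id; this.status = status; this.reopened = reopened;
            }
        }

        public static void main(String[] args) {
            List<Defect> defects = new ArrayList<>();
            defects.add(new Defect("D-001", "closed", false));
            defects.add(new Defect("D-002", "resolved", true));
            defects.add(new Defect("D-003", "new", false));
            defects.add(new Defect("D-004", "closed", true));
            defects.add(new Defect("D-005", "assigned", false));

            // Defect Status Metrics: number of defects in each state.
            Map<String, Integer> byStatus = new TreeMap<>();
            int reopenedCount = 0;
            for (Defect d : defects) {
                byStatus.merge(d.status, 1, Integer::sum);
                if (d.reopened) reopenedCount++;
            }
            System.out.println("Defects by status: " + byStatus);

            // Re-opened bugs: percentage with respect to total defects logged.
            double reopenRate = 100.0 * reopenedCount / defects.size();
            System.out.printf("Re-open rate: %.1f%% (%d of %d)%n",
                    reopenRate, reopenedCount, defects.size());
        }
    }

In practice these figures would be exported from the defect tracker rather than hand-built, but the
arithmetic is the same.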


10 Classification of Issues
The following standard will be used by all involved parties to identify issues found during testing:

Severity 1: Critical Issues: The application crashes, returns erroneous results, or hangs in a major area
of functionality, and there is no workaround. Examples include the inability to navigate to/from a
function, application timeout, and incorrect application of business rules.

Severity 2: High Functional Issues: Functionality is significantly impaired. Either a task cannot be
accomplished or a major workaround is necessary. Examples include erroneous error handling, partial
results returned, and form pre-population errors.

Severity 3: Medium Functional Issues: Functionality is somewhat impaired. A minor workaround is
necessary to complete the task. Examples include inconsistent keyboard actions (e.g. tabbing), dropdown
list sort errors, navigational inconsistencies, and serious format errors causing usage issues (e.g.
incorrect grouping of buttons).

Severity 4: Low Functional Issues: Functionality can be accomplished, but either an annoyance is present
or efficiency can be improved. Cosmetic or appearance modifications to improve usability fall into this
category. Examples include spelling errors, format errors, and confusing error messages.

Examples of each severity level will be delivered to participants prior to Integration Testing.
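
For teams that want this taxonomy encoded alongside automated checks, a minimal sketch in Java follows;
the enum and its descriptions simply restate the levels above and are not part of the defect tracking tool:

    // Illustrative encoding of the four severity levels defined above.
    public enum Severity {
        CRITICAL(1, "Crash, erroneous results, or hang in a major area; no workaround"),
        HIGH(2, "Functionality significantly impaired; task blocked or major workaround"),
        MEDIUM(3, "Functionality somewhat impaired; minor workaround completes the task"),
        LOW(4, "Task achievable; annoyance, efficiency, or cosmetic issue");

        private final int level;
        private final String description;

        Severity(int level, String description) {
            this.level = level;
            this.description = description;
        }

        public int level() { return level; }
        public String description() { return description; }
    }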

10.1 Assumptions, Constraints, and Exclusions

Issue classifications can include creative or content-related issues.


11 Unit Testing
11.1 Purpose

The purpose of Unit Testing is to deliver code that has been tested for end-to-end functionality within a
given module and normal interfacing between dependent modules in the development environment.

11.2 Responsibility

Testing will be the responsibility of the individual developers. Ultimate signoff for promotion into
Integration Testing will be the responsibility of the Development Project Manager(s). Configuration
management, builds, etc. will be the responsibility of the Configuration Management Team at the
direction of the Development Project Manager.

11.3 Environment

Refer to Section 4.4, Test Environment Details.

11.4 Exit Criteria

In order to be accepted for Integration Test, each component must:

• Successfully compile in the development environment

• Be tested for complete threads through all code, from UI to data access and back to UI, using test
data created by the developers

• Be tested for one example each of normal, high, and low boundary conditions for data input, where
appropriate

• Be tested for one example of error handling per event

• Successfully execute pairwise tests as required for inter-module interfaces, including likely error
conditions (e.g. common data-entry errors); a sketch of such a check follows the lists below

• Have no high-severity issues in an open state

• Have Project Manager signoff on individual modules

Specifically excluded from the Unit Test exit criteria are:

• Comprehensive data validation

• Exhaustive testing of the various entry and exit points across modules

• Comprehensive testing of abnormal situations and error-handling combinations
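
A hedged sketch in Java of the pairwise inter-module check called for above, using hypothetical module
stand-ins; the real module names and interfaces come from the design documents, not from this example:

    // Hypothetical pairwise check between a booking module and a rates module:
    // a common data-entry error (blank location code) should surface as a
    // handled error, not an unhandled failure.
    public class PairwiseInterfaceCheckExample {

        // Stand-in for the rates module interface.
        static String lookupRate(String locationCode) {
            if (locationCode == null || locationCode.trim().isEmpty()) {
                throw new IllegalArgumentException("location code is required");
            }
            return "RATE-FOR-" + locationCode;
        }

        // Stand-in for the booking module calling the rates module.
        static String startBooking(String locationCode) {
            try {
                return "booking started with " + lookupRate(locationCode);
            } catch (IllegalArgumentException e) {
                return "error shown to user: " + e.getMessage();
            }
        }

        public static void main(String[] args) {
            System.out.println(startBooking("DEN"));   // normal condition
            System.out.println(startBooking(""));      // common data-entry error
        }
    }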

11.5 Assumptions, Constraints, and Exclusions

Creation of test data and scripts for the purpose of Unit Testing is the responsibility of the development
staff(s).


12 Integration Testing
12.1 Purpose

The purpose of Integration Testing is to deliver code that has been comprehensively tested for Normal (N)
and Exception (E) conditions across all modules in the Development environment.

12.2 Responsibility

The Testing Team holds the primary responsibility for the execution of Normal (N) and Exception (E) test
scripts. All N & E type test scripts will be completed prior to the start of Integration Testing. The N & E
test scripts will be executed for the following modules:

• Homepage

• Reservations

• Rates

• Analytics

• Profile Management

• Locations

• Personalization

• Search

• Admin tools

• Visitor Management

• Content Management

• Administration

Configuration management, builds, etc. will be the responsibility of the Configuration Management Team
at the direction of, and with the agreement of, the Development Project Managers and the Testing Project
Manager. Ultimate sign-off of Integration Testing and promotion into System Testing resides with the
Testing Project Manager.

12.3 Environment

Refer to Section 4.4, Test Environment Details.


12.4 Test Data

12.4.1 Mainframe test data

The Test Lead will request test data migration/creation and will forward the details surrounding the
migration/creation to the appropriate individuals. After data migration/creation, the mainframe team
will forward the test data list to the Test Lead.

12.4.2 Local test data

The Test Lead will identify test data for the local database (Oracle 9i). With the help of the
development team, the Test Lead will ensure the test data is set up before the start of Integration Testing.

A Testing Data Repository Document will be delivered on or before Integration Testing. Specific reference
will be made in the N & E Test Scripts to the data types listed in the Testing Data Repository Document.

12.5 Test Execution Process

Integration Testing will comprise two (2) weeks.

Issue Identification: Integration Testers will log issues as they are identified.

Issue Resolution: It is expected that the Development Team(s) will undertake issue resolution based on
severity and priority. All efforts will be made to turn around Critical/High category issues in the next
scheduled Build/Release.

Build and Release Process: The Configuration Management Team will deliver fresh builds as requested
by the Development Project Managers and Testing Project Manager along with release notes.

Issue Closure: After each build, the Testing Team will review the issues that have been resolved in
order to verify and close or reinstate them and their resolution priority. All effort will be made to
close resolved issues as soon as possible.

Issue Tracking: The Testing Project Manager will be responsible for the administration of the issue
tracking tool.

12.6 Exit Criteria

In order to be accepted for System Test, each component must:

• Be successfully deployed to the System Test Environment

• For transaction-based data access, be tested successfully for “normal” and “exception”
conditions

• Contain no “dead” links/inaccessible pages

• Contain no Severity 1 or 2 issues

• Have Testing Project Manager signoff


12.7 Additional Information

Test scripts will be provided to the Development Project Managers prior to Integration Testing. The test
scripts provided should be used as a baseline for exit criteria expectations. Any additional tests or
scripts that the development staff deems necessary are left to the discretion of the Development Project
Managers. Should the Development Project Managers feel that such scripts should be incorporated into
the Testing Team scripts, they may request this of the Testing Project Manager. It will be the
responsibility of the Testing Project Manager to analyze the feasibility of such incorporation.

12.8 Assumptions, Constraints, and Exclusions

Assumptions: Functionality testing of XXX.com by the Testing Team will also include entry points from
other websites via link, travel portals, etc.

Exclusions from Integration Testing: Delivery of code that has been comprehensively tested for Data
Normal (DN) and Data Exception (DE) conditions across all modules in the Development environment. Any
issues discovered with the informative pages, creative design, or content should be reported to the
respective development area.


13 System (QA) Testing


13.1 Purpose

The purpose of System Testing is to deliver code that has been comprehensively tested and functionality
that is certified to be end-to-end user ready in the System Test environment.

13.2 Responsibility

The Testing Team holds the primary responsibility for the executions of Normal (N), Exception (E), Data
Normal (DN), and Data Exception (DE) test scripts. Test scripts will include field form validation and
display rules as stated in the Elements section of the Page Specifications for eCommerce XXX.com
Redesign. The N, E, DN, & DE test scripts will be executed for the following modules:

1. Homepage

2. Reservations

3. Rates

4. Analytics

5. Profile Management

6. Locations

7. Personalization

8. Search

9. Admin tools

10. Visitor Management

11. Content Management

12. Administration

The System Testing Team will comprise individuals from the testing staff. Configuration
management, builds, etc. will be the responsibility of the Configuration Management Team at the
direction of the Development Project Managers and with the agreement of the Testing Project
Manager. Ultimate sign-off of System Testing and promotion into User Acceptance Testing resides with the
Testing Project Manager.

13.3 Environment

Refer to Section 4.4, Test Environment Details.


System Test Script execution will be completed as per the following Operating System/Browser matrix:

                 Windows XP    Windows 2000    Windows 98    Mac OS/9

IE 6.0               C              C
IE 5.5                              U
IE 5.0                                              U            U
Mozilla 1.7.2        C              U
Netscape 7.1         U
AOL 5.0                                             U

C – The complete test case suite will be executed on the OS/Browser combination
U – Only critical functionality and UI test cases will be executed on the OS/Browser combination
Blank – The OS/Browser combination will not be tested

13.4 Test Data

13.4.1 Mainframe test data

The Test Lead will request test data migration/creation and will forward the details surrounding the
migration/creation to the appropriate individuals. After data migration/creation, the mainframe team
will forward the test data list to the Test Lead.

13.4.2 Local test data

The Test Lead will identify test data for the local database (Oracle 9i). With the help of the
development team, the Test Lead will ensure the test data is set up before the start of System Testing.

A Testing Data Repository Document will be delivered on or before System Testing. Specific reference will
be made in the N, E, DN, & DE Test Scripts to the data types listed in the Testing Data Repository
Document. Additional specific data may be required; should this be the case, the data will be listed on
the corresponding test script.

13.5 Test Execution Process

System Testing will comprise six (6) weeks.

Issue Identification: System Testers will log issues as they are identified.

Issue Resolution: It is expected that the Development Team(s) will undertake issue resolution based on
severity and priority. All efforts will be made to turn around Critical/High category issues in the next
scheduled Build/Release.

Build and Release Process: The Configuration Management Team will deliver fresh builds to the System
Test environment, as directed by the Testing Project Manager, along with release notes. If the situation
warrants, an emergency build may be released. The Testing Project Manager and all Development
Project Managers must be in agreement to proceed with the emergency build.

Issue Closure: The Testing Team will review the issues that have been resolved to verify and close or
reinstate them and their resolution priority. All effort will be made to close resolved issues as soon
as possible.

Issue Tracking: The Testing Team will be responsible for the administration of the tracking tool.

13.6 Exit Criteria

In order to be accepted for User Acceptance Test, the application must:

• Be successfully deployed to the System Test environment

• Be tested by the System Test Team according to all System Test Scripts

• Be performance/load tested

• Have no Severity 1 or 2 issues

• Have minimal Severity 3 or 4 issues, documented and with a known resolution path

• Have Testing Project Manager signoff and acceptance by the User Acceptance Team

Specifically excluded from the System Test exit criteria are:

• Security Testing

• System Crash/Restart Testing

o DB crash

o iPlanet crash

o Hardware

o DB capacity/resources

13.7 Additional Information

Regression and Load testing will take place prior to promotion to the User Acceptance Testing. Please see
the Regression Testing and Load Testing sections of this document for further information.

A copy of the test plan will be provided to the User Acceptance Group prior to System Testing for their
review. The test plan provided should be viewed as a baseline for System Testing exit criteria
expectations. Any items in the test plan that the System Testing Team or User Acceptance Group feels
should be modified or added should be submitted to the Testing Project Manager. It will be the
responsibility of the Testing Project Manager to analyze the feasibility of such incorporation or
modification.


13.8 Assumptions, Constraints, and Exclusions

Assumptions:

Functionality testing of XXX.com by the Testing Team will also include entry points from other websites
via link, travel portals, etc.

Testing of the Personalization engine will be limited to the business rules created by the developers.

Visitor tracking details will be verified only at the JSP level via "View Source," as the reporting tool
has not been finalized.


14 Mainframe testing
14.1 Purpose

The purpose of Mainframe testing is to deliver stable code for the new Rate Shop functionality and to
verify that existing functionality works correctly with the new XXX.com application.

14.2 Responsibility

Mainframe testing will be carried out by the Mainframe QA team at XXX. It will be scheduled and
coordinated by the XYZ Test team according to the test execution dates for System Testing, UAT, and PAT.

14.3 Environment

Mainframe modules will reside in the Budget Highway Acceptance Test Region (ATR).

14.4 Test Execution Process

The Mainframe QA team at XXX will deliver the unit-tested code of the Rate Shop feature to the XYZ
Development team. After integration with the application, the XYZ Test team will verify the Rate Shop
feature from an end-to-end user perspective, i.e. from the front end through to the mainframe database.
The XYZ Test team will be trained on using mainframe screens to verify rates and reservation data. The
XYZ Test team will raise issues using PVCS Tracker and escalate them to the IT Project Manager (Alfredo
Palacios), who will take them further with the Mainframe team for fixes.

To follow up on mainframe testing progress (during the mainframe testing period), a status report will be
provided to the Budget team on a weekly basis by the Mainframe QA team.

During test execution, the XYZ test team will provide a list of reservations to the Mainframe QA team so
that it can verify converting reservations to rentals and generating rental agreement numbers successfully.


15 Load Testing
15.1 Purpose

The purpose of Load Testing is to deliver code that has been load tested and is ready for promotion into
the Production Environment.

15.2 Scope

Load Testing will consist of a select group of (N) scripts that accurately represent a cross-section of
functionality. Scripts will be executed to generate peak loads of up to 1500 users.

Tests will be executed at load levels of 50, 100, 200, 500, 1000, and 1500 concurrent users. Test
execution is complete when the 1500-user load has been ramped up or a failure condition necessitates
stopping the test. The team will monitor the test execution and record the timings and errors for
report preparation.

15.3 Responsibility

The creation and execution of the Load Testing Scripts is the responsibility of the Testing Team. Ultimate
authority rests with the Testing Project Manager, who will be in close contact with User Acceptance
Group.

15.4 Environment

The Mercury Load Runner tool will be physically located on a server in Denver. For the purpose of test
execution, the Load Runner tool will be pointed at the System Testing Environment, which will become the
Production Environment upon implementation. The XYZ Test team will access Load Runner using a remote
client tool to execute the scripts. The offshore Test team will be allocated one VuGen license to create
scripts offline.

Description                    IP Address

Budget Application Servers
Controller
Load Generator
DB Server
Web Server

Details pertaining to Network:

• Network card setting – 100 Mbps duplex

• LAN bandwidth – 100 Mbps


15.5 Testing Methodology

15.5.1 Load testing
Load testing will be carried out under varying workloads to assess and evaluate the ability of the system
under test to continue to function properly under these different workloads. The goal of load testing is to
determine and ensure that the system functions properly beyond the expected maximum workload.
Additionally, load testing evaluates performance characteristics (response times, transaction rates,
and other time-sensitive measures).

15.5.1.1 Serviceability

Approach
- Determine the serviceability of the system for a volume of 1500 concurrent users.
- Measure response times for users.

Steps
1. Virtual user estimation: Arrive at the maximum number of concurrent users hitting the system
at which the system response time is within the response time threshold and the system is stable. This
number is the virtual user count and should be higher than the average load by a factor of x.

2. Virtual user profiles and their distribution across client operations:

The distribution will be captured in a matrix of user profiles against client operations.

User profiles: Corporate Program member, XXX Employee, Travel Agent, Unaffiliated consumer, XXX Partner,
UB Program member, PD member, FB member.

Client operations: Homepage load; Login/Logout; Rate Request response; Rate Request response for
multi-BCD rate shop; Create a booking; Modify/Cancel booking; One-click booking.

3. Load simulation schedule:

Schedule for concurrent user testing with a mix of user scenarios and the acceptable response
times:


Operation                              On Dial-up (56 Kbps)              On Broadband

Homepage Load                          19 seconds                        NA – Dial-up is considered
                                                                         more relevant
Log-in/Log-out                         NA – Broadband is considered      Sub-second
                                       more relevant
Rate Request-Response                  NA – as above                     Sub 20 seconds
Rate Request-Response for              NA – as above                     Sub 20 seconds
multi-BCD rate shop
Create a booking                       NA – as above                     1 min 30 sec (inclusive of
                                                                         mandatory intermediate steps)
Modify/Cancel Booking                  NA – as above                     1 min 3 sec (inclusive of
                                                                         mandatory intermediate steps)
One-Click Booking                      NA – as above                     30 seconds

Statistics

• A graph with response times on the y-axis and concurrent users on the x-axis will depict the
capability of the system to service concurrent users.

• The response times for slow users will provide worst-case response times.
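
A minimal sketch in Java of how measured response times might be checked against the broadband
thresholds in the table above; the threshold values come from the table, while the measured figures and
the map-based layout are assumptions for illustration:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Compares sample response-time measurements (seconds) against the
    // broadband acceptance thresholds listed in the table above.
    public class ResponseTimeCheckExample {
        public static void main(String[] args) {
            Map<String, Double> thresholdSeconds = new LinkedHashMap<>();
            thresholdSeconds.put("Log-in/Log-out", 1.0);
            thresholdSeconds.put("Rate Request-Response", 20.0);
            thresholdSeconds.put("Create a booking", 90.0);
            thresholdSeconds.put("One-Click Booking", 30.0);

            // Hypothetical measured averages, e.g. exported from the load tool.
            Map<String, Double> measuredSeconds = new LinkedHashMap<>();
            measuredSeconds.put("Log-in/Log-out", 0.8);
            measuredSeconds.put("Rate Request-Response", 23.5);
            measuredSeconds.put("Create a booking", 75.0);
            measuredSeconds.put("One-Click Booking", 12.0);

            for (Map.Entry<String, Double> e : thresholdSeconds.entrySet()) {
                double measured = measuredSeconds.get(e.getKey());
                String verdict = measured <= e.getValue()
                        ? "within threshold" : "EXCEEDS threshold";
                System.out.printf("%-25s measured %5.1fs, limit %5.1fs -> %s%n",
                        e.getKey(), measured, e.getValue(), verdict);
            }
        }
    }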

15.5.2 Endurance testing
Validate the system's behavior over continuous hours of operation (CHO) under projected load conditions.

The number of continuous hours of operation is to be discussed with the Business.

Approach

- Endurance testing: check resource usage and release, namely CPU, memory, disk I/O, and network
(TCP/IP sockets) congestion, over continuous hours of operation.

- Determine robustness: check for breakages in the web server, application server, and database server
under CHO conditions.

Steps

1. Arrive at a baseline configuration of the web server and application server resources, i.e. CPU,
RAM, and hard disk, for the endurance and reliability test.

2. The test would be stopped when one of the components breaks. A root cause analysis is to be
carried out based on the data collection described under the server-side monitoring section.

Client side monitoring

- Failure rate -- web server responses/timeouts/exceptions and incomplete page downloads

- Response time degradation under peak load numbers (concurrent users)

Server side monitoring

- Collect CPU, Disk and Memory usage for analysis


- Check for application server slowdown/freeze/crash

- Check for resource contention/deadlocks

- Database server load and slowdown

- Web server crashes

- Collect data for analysis to tune the performance of web server, application server and database
server

- If there is alarm support in the tool through an agent, check for alerts when the activity level
exceeds preset limits.

- If there is a load balancing configuration deployed, check whether it is able to distribute the requests.

Result

The result of this test will be proof of confidence for continuous hours of operation. The data
collected in this phase would give pointers to improve the reliability of the system and to tune
configuration and component parameters for reliable performance.
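
A minimal sketch in Java of the kind of periodic resource sampling described above, using only standard
JVM management APIs; the sample count and interval are assumptions, and production monitoring would use
the load tool's own agents:

    import java.lang.management.ManagementFactory;
    import java.lang.management.OperatingSystemMXBean;

    // Periodically samples JVM heap use and the system load average during a
    // long-running (continuous-hours-of-operation) test window.
    public class EnduranceMonitorExample {
        public static void main(String[] args) throws InterruptedException {
            OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
            Runtime rt = Runtime.getRuntime();

            for (int sample = 0; sample < 5; sample++) {  // 5 samples for the demo
                long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
                double load = os.getSystemLoadAverage();   // -1.0 if unavailable
                System.out.printf("sample=%d usedHeapMB=%d systemLoadAvg=%.2f%n",
                        sample, usedMb, load);
                Thread.sleep(1000);                        // sample once per second
            }
        }
    }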

15.5.3 Planned testing cycles

Load testing will be done on the XXX.com web application against a range of operational conditions and
factors, including network bandwidth, data volumes, and transaction frequency. The test cycles shall be
run on the network, measuring performance from 50 users to 1500 users.

The test cycle shall be run for 50 users initially (say, incrementing 5 users every 5 seconds until it
reaches 50 concurrent users); the sketch below illustrates the resulting ramp arithmetic. The test shall
be stopped if the application crashes before reaching 50 users, and the issue shall be reported to the
development team. The response time shall be noted for 50 concurrent users before stopping the test. If
the response time exceeds the benchmark limit, the load test shall be stopped until the development team
fixes the issue. If the response time is well within the benchmark limit, a fresh test cycle shall be run
with the aim of reaching 100 concurrent users. The same process shall be used until the 1500 concurrent
user target is met within acceptable response times.
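
A minimal sketch in Java of the ramp-up arithmetic just described, assuming the stated rate of 5 virtual
users every 5 seconds; the cycle targets are taken from the list below:

    // Computes the ramp-up schedule described above: add 5 virtual users
    // every 5 seconds until each cycle's target user count is reached.
    public class RampScheduleExample {
        public static void main(String[] args) {
            int[] cycleTargets = {50, 100, 200, 500, 1000, 1500};
            int usersPerStep = 5;
            int stepSeconds = 5;

            for (int target : cycleTargets) {
                int steps = (int) Math.ceil(target / (double) usersPerStep);
                int rampSeconds = steps * stepSeconds;
                System.out.printf("Cycle to %4d users: %3d steps, ramp-up %4d seconds%n",
                        target, steps, rampSeconds);
            }
        }
    }

For the first cycle this gives 10 steps and a 50-second ramp to 50 users.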

The response times will be noted for the following user loads within the same test cycle:

50 users
100 users
200 users
500 users
1000 users
1500 users

The first cycle of load testing will be carried out on the QA environment and the second cycle on the
Production environment during the System Testing phase.

15.6 Metrics to be measured

Client Side Primary Metrics:

• Response Time


• Throughput

• Concurrent users

OS Level Primary Metrics:

• Processor Usage

• Memory Usage

• Disk I/O Rates

App Server Primary Metrics:

To be discussed with the Technical team.

15.7 Test Deliverables

- Test scripts using Load Runner

- Performance test report containing all the analysis graphs

15.8 Assumptions, Limitations and Constraints

Assumptions

1. The transaction mix (user mix) shall be provided by the XXX Business team.

2. The XXX team shall provide the application setup. The application provided is expected to have been:

- Successfully deployed to the test environment.

- Tested by the System Test Team according to the System Test scripts.

- Verified against all requirements as per the requirements specification.

Constraints

If load test scripts are executed from offshore, network delay will add to the response times.

15.9 Exit Criteria

In order for Load Testing to be considered successful, the Load Scripts must be successfully executed
under the following conditions:

• Executed by Test Team

• Simulate up to 1500 virtual users


• Testing Project Manager sign-off

• Meet the exit criteria for the Phase in which the Load Test is executed


16 Regression Testing
16.1 Purpose

Deliver code that has been regression tested and is ready for promotion into the Production Environment.
Regression Testing will consist of a majority of the (N), (E), (DN), and (DE) type test scripts.

16.2 Responsibility

The creation of the Regression Testing Scripts is the responsibility of the Testing Team. Regression Test
Scripts will be created and executed using Mercury Quick Test Pro. The execution of the Regression
Testing Scripts is the responsibility of the Testing Team.

16.3 Environment

The Mercury – Quick Test Pro software will be physically located in XYZ, Bangalore. For the purpose of
test execution, QTP will be pointed to the System Testing Environment.

16.4 Exit Criteria

In order for Regression Testing to be considered successful the results must meet the exit criteria stated
in the corresponding testing phase exit criteria. For example, Regression Scripts executed during the
System Testing phase must meet the exit criteria stated in the System Testing section of this document.


17 User Acceptance Testing


17.1 Purpose

The purpose of User Acceptance Testing is to deliver code that has been tested by the User Acceptance
Test Group and functionality that is certified to be end-to-end user ready for promotion into the
Production Environment.

17.2 Responsibility

User Acceptance Testing is to be executed by the User Acceptance Group (Business). Management of the
User Acceptance Testing phase will be the responsibility of the Testing Project Manager via the User
Acceptance Group Coordinator. The test scripts used during User Acceptance Testing are to be created by
the Test Team with the help of the Business Analyst and the User Acceptance Group. Test scripts should
accurately reflect the functionality documented in the Booking Engine Use Cases EBR - Budget.com,
Non-Booking Engine Use Cases XXX.com Redesign EBR, and Page Specifications for Ecommerce XXX.com Redesign.

Ultimate authority rests with the Testing Project Manager, who will be in close contact with the User
Acceptance Group Coordinator. Configuration management, builds, etc. will be the responsibility of the
Configuration Management Team at the direction of the Development Project Manager and requires the
agreement of the Testing Project Manager.

17.3 Environment

Refer to Section 4.4, Test Environment Details.

17.4 Test Data

The requesting of data migration/creation is the responsibility of the Testing Project Manager. Details
surrounding the migration/creation will be forwarded to the appropriate individuals. A Testing Data
Repository Document will be delivered on or before User Acceptance Testing. Specific reference will be
made in the N & E Test Scripts to the data types listed in the Testing Data Repository Document.
Additional specific data may be required. Should this be the case, the data will be listed on the
corresponding test script.

17.5 Test Execution Process

User Acceptance Testing will comprise five (5) weeks.

Issue Identification: UAT Testers will log issues as they are identified.

Issue Resolution: It is expected that the Development Team(s) will undertake issue resolution based on
severity and priority. All efforts will be made to turn around Critical/High category issues in the next
scheduled Build/Release.

Build and Release Process: The Configuration Management Team will deliver fresh builds to the UAT Test
environment, as directed by the Testing Project Manager along with release notes. If the situation
warrants, an emergency build may be released. The Testing Project Manager and all Development Project
Managers must be in agreement to proceed with the emergency build.


Issue Closure: The UAT Testing Team (Business) will review the issues that have been resolved to verify
and close or reinstate them and their resolution priority. All effort will be made to close resolved
issues as soon as possible.

Issue Tracking: The Testing Team will be responsible for the administration of the tracking tool.

The UAT defect flow is as follows:

1. A UAT group tester identifies a defect.
2. The tester logs the defect in PVCS Tracker and assigns it to the UAT coordinator.
3. The UAT coordinator reviews the defect for validity and details. If it is not valid, the defect is
re-assigned to the tester for more clarification.
4. If it is valid, the UAT coordinator assigns the defect to a developer, who fixes it.
5. A new application version is released into the UAT environment with release notes.
6. The Test team verifies the defect, and the UAT group tester then verifies the fixed defect.
7. If the defect passes verification, it is closed; otherwise it is returned for another fix.

17.6 Exit Criteria

In order to be accepted for promotion to the Production Environment, the application must:

• Be tested by the User Acceptance Group (Business)

• Be regression and load tested (the responsibility of the Testing Team)

• Have no Severity 1, 2, or 3 issues

• Have minimal Severity 4 issues, documented and with a known resolution path

• Have User Acceptance Group approval for any Severity 4 issues to be included in the production release

• Have User Acceptance Group Coordinator and Testing Project Manager sign-off

Specifically excluded from User Acceptance Test exit criteria are:

• Security Testing

• System Crash/Restart Testing


o DB crash

o iPlanet crash

o Hardware

o DB capacity/resources

17.7 Additional Information

Regression and Load testing will take place prior to promotion to the Production Environment. Please see
the Regression Testing and Load Testing sections of this document for further information.

17.8 Assumptions, Constraints, and Exclusions

• UAT is conducted as per the documented and signed-off requirements.

• Any changes or new functionalities that come up during UAT will go through the Change Management
process.

• The system is stable and available during the scheduled testing period.


18 Soft Launch
TBD (details to be entered after discussion with the Business)
