Purpose
The purpose of this slide deck is to provide Accenture projects and resources with first-hand project experience using the Oracle Application Testing Suite (OATS), and in particular the Oracle Load Tester (OLT), when performance testing the Oracle Retail Predictive Application Server (RPAS). This guide provides the information necessary to properly plan, set up, execute and report on RPAS Oracle Fusion Client performance tests using OATS. It is not an installation guide and does not provide installation steps for the individual components; for that type of information, please refer to the reference section at the end of this document. Although not every procedure and step needed to create an end-to-end performance test is listed here, the document offers practical advice that will allow the reader to fast-track the overall process and avoid some potential pitfalls. It should answer the most relevant questions a project team may have with regard to testing the Oracle Fusion Client using OATS. The results of the RPAS MFP tests conducted can be found at the following location: https://kx.accenture.com/Repositories/ContributionForm.aspx?path=C22/39/4&mode=Read
Agenda
Overview
Oracle RPAS Fusion Client and ADF explained
The Oracle Application Testing Suite
Why Oracle Load Tester?
Oracle Load Tester architecture
System requirements for OATS
Other performance testing considerations
Practical OpenScript advice
Parameterisation of scripts
Performance testing process
Test execution
Performance test reporting
Room for improvement
Lessons learnt
References
Overview
Intended audience
This document is intended for all Accenture TA teams involved in performance testing/performance engineering activities related to Oracle RPAS. The focus of the document is Oracle RPAS, but the same principles can be applied when testing other web-based applications.
Prerequisite skills and knowledge
The following skills/knowledge would be beneficial to achieving the desired outcome in terms of performance testing the Oracle RPAS Fusion Client using OATS:
Oracle RPAS (preferable)
Java development (preferable)
Oracle ADF (preferable)
A basic understanding of application testing, load testing, scalability testing etc. (required)
[Architecture diagram: an OLT controller and its OLT database driving load testing agents (Agent 1, Agent 2, Agent 3 ... Agent n) against the target application running on WebLogic Server]
Hardware
2 x 3 GHz processors, 4 GB RAM, 50 GB HDD, IE 8 or 9, Oracle XE DB (or full Oracle DB)
2 x 3 GHz processors, 4 GB RAM, 50 GB HDD, IE 8 or 9, Oracle XE DB (or full Oracle DB)
1 x 3 GHz processor, 2 GB RAM, 50 GB HDD, IE 8 or 9
1 x 3 GHz processor, 2 GB RAM, 50 GB HDD, IE 8 or 9
Software
Windows Server 2003 R2 / Windows Server 2008
OpenScript
Monitoring
Although OATS has its own monitoring capability, it is advisable to also use tools such as nmon or Enterprise Manager to collect and graph statistics. In particular, the following must be included when monitoring RPAS performance:
Component area: Fusion Front End Server
Metrics:
JVM heap size
JVM garbage collection
CPU utilization percentage
Memory (utilization, swapping/paging)
Server disk I/O
If the clients access the RPAS server across the WAN, utilisation statistics should be gathered to determine the effect the users have on available bandwidth.
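A quick back-of-the-envelope estimate helps decide whether dedicated bandwidth monitoring is needed before the test. The sketch below shows such a calculation; the payload size and action rate are assumed example figures, not measured RPAS values, so substitute numbers from your own monitoring data.

```java
// Illustrative WAN bandwidth estimate for concurrent RPAS users.
// BYTES_PER_ACTION and ACTIONS_PER_MINUTE are assumptions for the
// example, not measured values.
public class WanBandwidthEstimate {

    // Average bytes transferred per user action (assumed).
    static final long BYTES_PER_ACTION = 150_000;
    // Actions per user per minute at peak (assumed).
    static final double ACTIONS_PER_MINUTE = 4.0;

    /** Estimated peak bandwidth in kilobits per second. */
    static double peakKbps(int concurrentUsers) {
        double bytesPerSecond =
            concurrentUsers * BYTES_PER_ACTION * (ACTIONS_PER_MINUTE / 60.0);
        return bytesPerSecond * 8 / 1000.0; // bytes/s -> kbit/s
    }

    public static void main(String[] args) {
        // 50 users at the assumed rates need roughly 4 Mbit/s.
        System.out.printf("50 users: %.0f kbps%n", peakKbps(50));
    }
}
```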
Cost
Licensing for OATS can be expensive. The cost for load testing is calculated from the number of concurrent users on the system; the Functional Tester is charged by the number of users developing scripts.
Performance testing process: Define scope > Model scripts > Dry run > Execute > Report
1. Define scope
Determine concurrent user loads
Determine performance test type
Determine user scenarios
Define test environment
2. Gather requirements
Develop and document test scripts
Determine user allocations
Determine data requirements
Define testing prerequisites
3. Prepare environment
Load data
Create system users in RPAS
Develop and execute any prep scripts
Set up load testing agents in OLT
Set up ServerStats in OLT
Set up any additional monitoring
4. Model scripts
Build and test basic scripts in OpenScript for each scenario
Parameterise and test each script in OpenScript
Define think times for each completed script
5. Dry run
Build complete scenarios in OLT
Execute all scenarios with 25% of the planned user volume
Correct any errors and iterate until error free
Complete an hour-long dry run
6. Execute
Execute all scenarios with the user volume loads defined in scope
Monitor and capture all infrastructure statistics
Execute multiple iterations of the test
7. Report
Gather all transactional statistics
Gather all infrastructure statistics
Develop the final report
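The reporting step boils down to aggregating per-transaction timings. The sketch below shows a minimal version of that aggregation (average and nearest-rank 90th percentile); the sample timings are invented for the example, while in practice the values come from the OLT session results per step group.

```java
import java.util.Arrays;

// Minimal sketch of the reporting step: average and 90th percentile
// response times from a set of transaction timings.  The sample values
// in main() are made up for illustration.
public class TransactionStats {

    static double average(double[] timesMs) {
        return Arrays.stream(timesMs).average().orElse(0.0);
    }

    /** Nearest-rank 90th percentile of the timings. */
    static double percentile90(double[] timesMs) {
        double[] sorted = timesMs.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(0.90 * sorted.length) - 1;
        return sorted[Math.max(rank, 0)];
    }

    public static void main(String[] args) {
        double[] loginMs = {820, 760, 910, 1300, 690, 740, 880, 950, 700, 2100};
        System.out.printf("avg=%.0f ms  p90=%.0f ms%n",
                average(loginMs), percentile90(loginMs));
    }
}
```

Reporting a percentile alongside the average is worthwhile because a single slow outlier (like the 2100 ms sample above) can skew the mean without affecting most users.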
As you record your scripts, OLT will number and name your script's step groups. This automatic naming and numbering is almost always incorrect, so it is best to switch auto step creation off. This is done in the Script Properties.
Ensure that the Step Group option is set to Do not create/number/name steps. The step groups are what you will report against in terms of transactions, so it is vital that they are correct.
Create a new Step Group for each transaction that you will be recording
Number the step using square brackets and give it an appropriate title, e.g. [1] Open webpage
Click the record button to initiate script recording; a browser window should open. As you complete recording one step, click pause and create the next Step Group, e.g. [2] Login.
Once completed the script should look similar to the figure on the left
Parameterisation of Scripts
Using databanks
After the initial script is recorded and tested, it will need to be parameterised so that it can be executed by multiple users or with different inputs during execution. Before attempting to parameterise scripts, identify the data being injected into them, then create a CSV file with the relevant data and save it to [INSTALLDIR]\OFT\DataBank. The format of the file is the name of each element followed by the corresponding data, separated by commas. This databank file is then added to the script during the parameterisation set-up. It is a good idea to make a copy of your working script before altering it with parameterisation. The most obvious place you will need multiple inputs is during logon; injecting data from the databanks allows multiple users to log on to the system. The databank depicted injects the username and the domain the users are logging on to, and also injects specific departments for each user during test execution.
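A databank file of the kind described above can be generated with a short script rather than typed by hand. The sketch below illustrates the header-plus-rows CSV layout; the column names (Username, Domain, Dept) and the values are examples chosen to match the logon scenario, not the actual project databank, and the temp-file path stands in for [INSTALLDIR]\OFT\DataBank.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Sketch of generating a databank CSV for the logon scenario.  The
// first line carries the field names that the script later references;
// each following line is one user's record.  All names and values here
// are illustrative assumptions.
public class DatabankWriter {

    static List<String> databankLines() {
        return List.of(
            "Username,Domain,Dept",
            "pftest051,MFP_Domain,D 129 WATCHES",
            "pftest052,MFP_Domain,D 130 JEWELLERY");
    }

    public static void main(String[] args) throws IOException {
        // In a real setup this would be saved under [INSTALLDIR]\OFT\DataBank
        Path out = Files.createTempFile("MFP_users", ".csv");
        Files.write(out, databankLines());
        System.out.println("Wrote " + Files.readAllLines(out).size() + " lines");
    }
}
```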
Substitute databank variables
Add the databank created previously to the new script as per the user guide, then save the script; the databank should now be available. To parameterise the script, expand it at the step group where values are being substituted. A good way to find the correct variable is to look for the data originally used during the recording, e.g. pftest051.
Right-click the appropriate parameter under Post Data and select Substitute. This method of parameterisation works when OpenScript adequately identifies the window in which the data is being entered, such as a logon screen.
Custom variables
In some instances OpenScript cannot substitute the variables you are looking to replace, because it may not recognise the field being replaced. In these cases you need to substitute the parameters directly with the details of your databank and the data being replaced. Right-click the appropriate parameter under Post Data and select Properties, then replace the recorded data, e.g. D+129+Watches.
The reference to the databank would be {{db.MFP_edgars_users.Dept,D 129 WATCHES}}. The databank in this example is called MFP_edgars_users, the variable is Dept, and an actual example of the data is D 129 WATCHES. Double curly brackets are used to wrap this custom parameter.
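The behaviour of such a token is that the databank value is injected at runtime, with the value after the comma serving as the recorded fallback. The sketch below mimics that resolution in plain Java to make the semantics concrete; it is an illustration under that assumption, not OLT's actual implementation.

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Illustration of how a {{db.<databank>.<field>,<recorded value>}}
// token resolves: the current databank row's value wins when present,
// otherwise the recorded value after the comma is used.  This mimics
// the substitution semantics; it is not OLT's implementation.
public class TokenResolver {

    static final Pattern TOKEN =
        Pattern.compile("\\{\\{db\\.([^.]+)\\.([^,}]+),([^}]*)\\}\\}");

    static String resolve(String postData, Map<String, String> databankRow) {
        Matcher m = TOKEN.matcher(postData);
        StringBuilder sb = new StringBuilder();
        while (m.find()) {
            String field = m.group(2);
            String recorded = m.group(3);
            m.appendReplacement(sb, Matcher.quoteReplacement(
                databankRow.getOrDefault(field, recorded)));
        }
        m.appendTail(sb);
        return sb.toString();
    }

    public static void main(String[] args) {
        String data = "dept={{db.MFP_edgars_users.Dept,D 129 WATCHES}}";
        System.out.println(resolve(data, Map.of("Dept", "D 200 SHOES")));
        System.out.println(resolve(data, Map.of())); // falls back to recorded
    }
}
```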
Test execution
All load tests are set up and executed via the OLT user interface
Create a new scenario in OLT and add all the scripts developed in OpenScript to it. Allocate the number of users to the scenario, assign the scripts to a load testing agent (system), and configure any additional parameters needed. Once all the scripts have been added and configured, add them to the Autopilot.
Configure when the load test is started (at a specific time or when the button is pressed)
Configure how the load test is stopped (after a delay or once the button is pressed)
Configure the VU ramp-up
Select the ServerStats configuration previously set up (verify the correct monitors are in place)
Click the green start button
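It helps to work out the ramp-up arithmetic before configuring the Autopilot, so the test reaches full load with enough steady-state time left. The sketch below models a linear ramp-up of the kind OLT supports; the batch size and interval are example values, not project settings.

```java
// Sketch of a linear VU ramp-up schedule of the kind configured in the
// OLT Autopilot: start with one batch of users, then add a batch every
// interval until the target is reached.  Batch size and interval here
// are example values.
public class RampUp {

    /** Number of active virtual users t seconds after the test starts. */
    static int activeUsers(int target, int batch, int intervalSec, int tSec) {
        int started = batch * (1 + tSec / intervalSec);
        return Math.min(started, target);
    }

    public static void main(String[] args) {
        // 100 VUs, 10 added every 30 s: full load is reached at 270 s,
        // so a 1-hour test leaves ~55 minutes at steady state.
        for (int t = 0; t <= 300; t += 60) {
            System.out.printf("t=%3ds -> %d users%n",
                    t, activeUsers(100, 10, 30, t));
        }
    }
}
```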
Infrastructure reporting
If ServerStats has been configured, infrastructure metrics will be gathered during test execution. These metrics can then be used to create graphs for the metrics that have been configured. It is still recommended to gather additional infrastructure statistics with a tool such as nmon.
Lessons learnt
Several lessons were learnt using OATS:
A good understanding of the application and what it does is needed before diving into script development; this understanding is needed to know what is possible/practical in OLT, as well as what can and cannot be parameterised
Ensure the functional team is intimately involved in developing the test scenarios/scripts and understands what is practical in terms of performance testing using OLT
Think times are critical to generating load; ensure the think times specified are as realistic as possible
As far as possible, the RPAS code should be stable, as changes to the environment may require re-recording of scripts
Any update to the Fusion environment (i.e. the Fusion Client) can cause working OLT scripts to fail after the upgrade, which may require re-recording of scripts
Make use of OpenScript wherever possible to set up users, creating simple scripts to perform basic set-up activities in the test environment
Setting up and preparing for testing takes the majority of performance testing time; allocate enough time for these activities (this relates specifically to the point below)
Once all data has been set up in RPAS (e.g. users created, passwords reset, views created), take a copy of the domain prior to testing; in the event that the domain is corrupted, there is then no need to re-run all the set-up activities
References
The following documents were used as reference material when compiling this document:
Oracle Application Testing Suite Getting Started Guide: http://download.oracle.com/otn/nt/apptesting/oats-docs-9.31.0036.zip
Oracle Load Testing User's Guide: http://download.oracle.com/otn/nt/apptesting/oats-docs-9.31.0036.zip
Oracle ADF Overview: http://www.oracle.com/technetwork/developer-tools/adf/adf-11-overview-1-129504.pdf