
GDW MONITORING

Abstract

This document contains all the information about the GDW Monitoring process.

Process Chains Monitoring

This document describes all the processes that need to be followed in GDW Monitoring.
GDW is the Global Data Warehouse. Data from various modules such as MM, FI-CO, PP, SD, AR, and AP flows into GDW. Hence GDW is connected to several source systems, both SAP and non-SAP.
GDW has a mechanism called extraction which extracts data from the source systems. In SAP GDW we have two types of data loads:
1. ECC data loads
2. Flat File Data Loads
1. ECC data loads: In this type of data load, the DataSource extracts data directly from the source system and loads it into GDW (PSA).
2. Flat File data loads: In this type of load, the DataSource is not directly connected to the source system. We load data in the form of a flat file (.CSV). This type of load is further subdivided into two categories:
A. Load from local workstation
B. Load from file server
A. Load from local workstation: we can load the flat file from our local desktop.
B. Load from file server: we keep the flat file on the file server (T-code AL11) at a predefined location, and the InfoPackage picks the file up from the file server and loads it into BW.
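As a rough sketch of the kind of pre-load check that can catch a bad flat file before the InfoPackage picks it up (the column layout and sample data here are hypothetical, not the actual GDW file spec):

```python
import csv
import io

def validate_flat_file(text, expected_columns):
    """Check a .CSV flat file before it is handed to the InfoPackage:
    every row must have the expected number of columns and no row
    may be completely empty."""
    problems = []
    reader = csv.reader(io.StringIO(text))
    for line_no, row in enumerate(reader, start=1):
        if not any(cell.strip() for cell in row):
            problems.append(f"line {line_no}: empty row")
        elif len(row) != expected_columns:
            problems.append(
                f"line {line_no}: {len(row)} columns, expected {expected_columns}"
            )
    return problems

# Hypothetical three-column material file; the last row is missing a field.
sample = "MATNR,PLANT,QTY\n1000,DE01,25\n1001,DE01\n"
print(validate_flat_file(sample, expected_columns=3))
```

A file that passes returns an empty list; anything else is worth fixing before the load rather than debugging a red request in the PSA afterwards.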

In GDW, monitoring means keeping an eye on the running status of the process chains and correcting them if any error occurs. Our ultimate goal is to make data available on time for reporting, so you can understand how important GDW monitoring is. Successful completion of the process chains is critical in GDW. Can you imagine a situation where a developer has built a GDW report but the data load for that report never completes? In that case there is no point in having the report, because it does not fulfil the business requirement.
I hope the importance of GDW monitoring is now clear to you.
Now let's move to the actual work you need to take care of.
All the available process chains in GDW can be checked in T-code RSPC.
Procedure to log in:
1. Open the SAP Logon Pad and select the BW Production (WP1) system with client 100.

2. Provide your user ID and password.

3. After providing the ID and password, you will see the screen below.

4. Open T-code ST13 as below:

Steps to access ST13:


1. Execute t-code ST13

2. Provide Tool Name as BW-TOOLS and click on execute.

3. Here select Process Chain Analysis and click on execute.

4. Click on Process Chains.

5. Select the start date and end date according to your requirement and click on Execute; you will then see the screen below.

6. On this screen all process chains will be listed (both running and completed).
The following columns are available here:
Status: current status of the process chain.

Yellow: in progress
Green: completed
Red: failed with error

Steps: status of the local chain, if any.

Main: marked if the chain is a Meta chain; otherwise not marked.
Chain: process chain name.
Log-id: process chain ID.
Sub Chains: number of local chains.
Steps: number of steps.
Day: day on which the process chain was executed.
Date: date on which the process chain was executed.
Time: time at which the process chain was executed.
Run Time: total time taken by the process chain.
Run Time [Sec]: total time taken by the process chain, in seconds.
End Date: date on which the process chain completed.
End Time: time at which the process chain completed.
The progress of a process chain can be checked by clicking on its Log-id.
If the process chain completed successfully, nothing further is needed. In case of failure, we need to correct it immediately by taking the right action.
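The triage rule above can be sketched as a tiny lookup; the action strings are illustrative, not official team wording:

```python
# Map the ST13 status colours to the monitoring action described above.
ACTIONS = {
    "green": "Completed - no action needed",
    "yellow": "In progress - keep monitoring",
    "red": "Failed - take corrective action immediately and inform the GDW team",
}

def triage(status):
    """Return the monitoring action for an ST13 status colour."""
    return ACTIONS.get(status.lower(), "Unknown status - check the chain log in RSPC")

print(triage("Red"))
```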

Expected actions: below are the expected actions from the GDW monitoring team.

Monitor ST13 regularly.

If any process chain fails, contact the GDW team.
All email communication should happen via the C&D ID only.
In case of failure you will receive an email from the AT&T team. Respond to that email as below:

FW Ticket#INC2646897 Alert Error Incident - jobId 35976036 (ZPGDW_P2P_MMD_METACHAIN) - jobtopid 35976024 - on GDW_Queue_100.msg

Note: Always mark HAAS-SAP-REDWOOD@list.att.com in case any ad hoc run or rerun of failed jobs is required from the Redwood end.
The mail to trigger the next dependent job should go out only after completion of the Meta chain.

FW C&D Batch Job Report-16th June 2015 at 05 08 40 AM EST.msg

You will receive an email alert if any process chain is running late or has not started at its scheduled time. In this case too, you need to respond to the AT&T team as per the email below.

FW Alert EventWait - jobId 35982803 (ZPGDWCHAIN_O2C_FIN_LOAD_01) - Not Start - jobtopid 35982803 - on GDW_Queue_100.msg

Always send an email once PCs ZPGDW_ALL_MDD_MASTER_ATTR and ZPGDW_ALL_MDD_MASTER_META have completed.

FW DSR Status on 06 17 2015.msg

Once PCs ZPGDW_O2C_SDD_DSR and ZPGDW_O2C_DS_APD have completed, send an email as per the format below.

Report distribution will start at 6:00 AM EST. Check whether the DSR report was distributed or not, as per the sheet below.

DSR Report Distribution List.xlsx

Send a final email communication acknowledging successful completion of the DSR process chain and DSR report distribution. Check the email below.

FW DSR Status on 06 18 2015.msg

Introduction to process chains:

Process chains are a robust graphical scheduling and monitoring tool for the automation, visualization, and monitoring of tasks/processes. The InfoPackage controls the transfer of data from the source to the entry layer of BI; an InfoPackage loads data from a source system only up to the PSA.

To monitor a process chain we have T-code ST13; to display/view a process chain, use T-code RSPC.

InfoPackage:
The InfoPackage is the data-loading scheduler from which you can execute the extraction of data from the source system.

On the left side you see when the chain was executed and how it ended. On the right side you see, for every step, whether it ended successfully or not. In the example, the first two steps were successful and the step "Load Data of an InfoPackage" failed. You can check the reason via the context menu options Display messages or Process monitor. Display messages shows the job log of the background job and the messages created by the request monitor. With Process monitor you get to the request monitor and see detailed information on why the loading failed. The logs are stored in tables RSPCLOGCHAIN and RSPCPROCESSLOG.
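Conceptually, finding the failed step in such a log comes down to filtering the log rows by their state. A minimal sketch follows; the field names are only loosely modelled on RSPCPROCESSLOG, and the single-letter state codes ("G" for green/success, "R" for red/error) are an assumption to be checked against the real table:

```python
# Sketch: pick out the failed steps from a process-chain log extract.
# Field names and state codes ("G" = success, "R" = error) are assumptions,
# not the verified RSPCPROCESSLOG layout.
log_rows = [
    {"chain_id": "ZPGDW_P2P_MMD_METACHAIN", "variant": "LOAD_IP_01", "state": "G"},
    {"chain_id": "ZPGDW_P2P_MMD_METACHAIN", "variant": "LOAD_IP_02", "state": "R"},
    {"chain_id": "ZPGDW_P2P_MMD_METACHAIN", "variant": "ACTIVATE_ODS", "state": "G"},
]

# These are the steps to investigate with Display messages / Process monitor.
failed = [row["variant"] for row in log_rows if row["state"] == "R"]
print(failed)
```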

Data Transfer Process (DTP):

The data transfer process (DTP) transfers data within BI from one persistent object to another, in accordance with certain transformations and filters. The data transfer process controls the distribution of data within BI.

ROLL UP:
When you roll up data for an InfoCube, the system loads the new data into any aggregates that exist for the InfoCube.

COMPRESSION:
Compression is used to save memory space and to improve the performance of a cube. When we load data it goes to the F table; after we compress a data-load request, it moves to the E table. Once we compress a request, we cannot delete that request directly from the cube; to delete that data, we must use Selective Deletion.
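The F-to-E move can be pictured as dropping the request dimension and aggregating, which is exactly why individual requests can no longer be deleted afterwards. This is a conceptual sketch only, not how BW stores the tables internally; the sample rows are hypothetical:

```python
from collections import defaultdict

# Uncompressed F-table rows: one row per request per characteristic combination.
f_table = [
    {"request": "REQ1", "material": "M1", "plant": "P1", "qty": 10},
    {"request": "REQ2", "material": "M1", "plant": "P1", "qty": 5},
    {"request": "REQ2", "material": "M2", "plant": "P1", "qty": 7},
]

def compress(rows):
    """Collapse the request dimension: sum the key figures per
    characteristic combination, as compression does when moving
    data to the E table."""
    totals = defaultdict(int)
    for row in rows:
        totals[(row["material"], row["plant"])] += row["qty"]
    return [{"material": m, "plant": p, "qty": q} for (m, p), q in totals.items()]

e_table = compress(f_table)
print(e_table)
# REQ1/REQ2 are gone after compression, so a single request can no
# longer be deleted - only Selective Deletion by characteristics works.
```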

Once all the DSR process chains have completed, we have to send a status mail to the team.

AT&T mails:
If any chain is long running or has failed, we will receive a mail from the AT&T team, and we need to reply to them immediately.
Please find below the image of the mail we will receive for a long-running process chain.

We need to respond immediately, as in the mail below, and monitor the process chain closely until it completes; if it takes more than 00:30 -- 01:00 hour, recheck with the GDW team.
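The elapsed-time rule just described can be sketched as a simple check; the 30-minute and 1-hour thresholds come from the text, while the chain start time and classification strings are illustrative:

```python
from datetime import datetime, timedelta

WARN_AFTER = timedelta(minutes=30)   # start watching the chain closely
ESCALATE_AFTER = timedelta(hours=1)  # recheck with the GDW team

def check_runtime(started_at, now):
    """Classify a still-running chain by how long it has been active."""
    elapsed = now - started_at
    if elapsed >= ESCALATE_AFTER:
        return "recheck with GDW team"
    if elapsed >= WARN_AFTER:
        return "monitor closely"
    return "normal"

now = datetime(2015, 6, 17, 6, 0)
print(check_runtime(datetime(2015, 6, 17, 4, 45), now))  # running for 1h15m
```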

When a process chain has failed, we will receive an alert mail and need to reply to it; please find the image below.

We need to reply to the above mail.

Once the failed process chain has completed, we need to reply to the AT&T team.

DSR Report:
The DSR reports are among the most important; they have to be delivered on time and need to be monitored.

Once all the DSR reports are out, we need to send the status mail to the business users. Please find the mail format below.

Note: I will show and explain all the monitoring-related activities in the training session.

BPC Meta Chain issue:

This is one of the important chains in GDW monitoring. PC name: ZPGDW_A2R_FID_BPC_META.
As this chain runs for a long time, we need to take action; please find the process below.
Step 1:

Step 2: Right-click > Process Monitor.

Step 3: Again go to Process Monitor, which is the local sub-chain.

Step 4: The chain below will run for a long time. As we discussed, we need to watch the four InfoPackages below very closely and check whether they are fetching records. Once they start fetching records there is no issue, but sometimes one or two InfoPackages won't fetch any records. Monitor for 10-15 minutes; if they are still not fetching any records, report to the GDW team, and we need to take action immediately, as discussed in the session.

If you have the access and the mentioned InfoPackages are not fetching records, we need to kill the jobs in the ECC system (EP1). Process to kill the request in ECC:
Step 1:
Select the request from the BW side: right-click > Header > Request. Please find the image.

Once the request is selected, go to the ECC production system (EP1) and kill the job.
Go to T-code SM37, paste the request, and execute.

Double-click on the request, go to Job details, and note the executing server and PID number.

Once you have collected this data, select the application server (Select Server Name), double-click on the server, enter the details, select the PID number, and kill the job (Cancel without Core).

Once the above is done, the job will be cancelled.

DSR Leakage Amount:

Once all the DSR chains have completed, we have to check the leakage amount; it should be within the range of -2000 to 2000. If it crosses above -1400, please update the GDW team and, after confirmation, send a mail to the business users. Path to run the report:
Documents > Public Folders > Global Data Warehouse > Sales and Profitability Analysis > DS-Sales > DS-Sales Tracker.
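The range check above can be sketched as follows. The -2000 to 2000 band comes from the text; treating the -1400 figure as an early-warning mark that triggers the "update the GDW team" mail is my interpretation of the wording and should be confirmed with the GDW team:

```python
# Acceptable leakage band and early-warning mark, taken from the text above.
LOW, HIGH = -2000, 2000
WARN = -1400  # "if it crosses above -1400, update the GDW team"

def classify_leakage(amount):
    """Classify the DSR leakage amount for the status mail."""
    if not LOW <= amount <= HIGH:
        return "out of range - escalate to GDW team"
    if amount > WARN:
        return "within range but past the warning mark - update GDW team"
    return "within range"

print(classify_leakage(-1800))  # inside the band, below the warning mark
print(classify_leakage(-500))   # inside the band, past the warning mark
print(classify_leakage(2500))   # outside the band
```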

Mail Format:

DSR Report:
Once the DSR loads have completed successfully, we need to check the BO reports; the reports' start time is 04:43:00 EST.
Before the reports start, we need to check one process chain: ZPGDW_REPORT_SCHEDULE.
Please find the path below.
URL: http://w218826v22.attcnd.local:8090/BOE/BI
Step 1: Login page:

Webi page:

Please find the path for the reports:

Click on Documents.

Once we click on Documents, there is a Folders icon; below it we have Public Folders > Global Data Warehouse > Sales and Profitability Analysis > Daily Sales.
All the reports are available under Daily Sales.

The first report to start is DS-CORP; check whether its status is Running, Failed, or Success:
right-click > History > check the status. If any status is Failed, please update the GDW team.

Time-Based Reports:

We have time-based reports which run at scheduled timings.

There are three time-based reports, available in the Order Custom folder.

Points to Remember:

GDW process chains run 24x7.

CCA report distribution happens on the 7th of every month. Please check the email below.

FW CCA report distribution for 7th June 2015.msg (Command Line)

Always create one incident daily for the monitoring activity and resolve it once GDW report distribution has completed. Take incident INC19152692 as a reference.
For the email triggering the next dependent job: make sure you send this email only once and wait for the reply. If you don't receive any reply, call them. Never send the email again.

BODS-related failures:

Below are the job names and the actions to be taken by HCL.
1. GDW_BOD_GOODS_MOVEMENTS - Send a mail to AT&T asking them to mark it as complete and trigger the dependent jobs, if any.
2. GDW_BOD_MRP_ELEMENT - Send a mail to AT&T asking them to mark it as complete and trigger the dependent jobs, if any.
3. GDW_BOD_SPIN - Send a mail to AT&T asking them to mark it as complete and trigger the dependent jobs, if any.
4. GDW_BOD_POS - Check the error in BODS.
5. GDW_BOD_SIEBEL - Check the error. Inform AMS-Siebel support that there is some issue with the Siebel loads in GDW. Inform the same to user Terri Miller.

Important links:

BODS: http://pr-bodsp-01:28080/DataServices/launch/logon.do
ID: admin
Password: admin

SAP Business Objects: http://w218826v22.attcnd.local:8090/BOE/BI

Your SAP ID and password will work here.

ZPGDW_F2S_PP_MRPELEMENT - runs every Saturday.
ZPGDW_F2S_PLANNING_WEEKLY - runs every Saturday.
