Table of Contents
Steps for Data Import in S&OP On-Premise using ETL
Data Import Fundamentals
Master Data Upload Steps
Key Figure Upload
1. Batch ID: This must be unique and cannot be reused. You can either use the sequence
delivered by S&OP or the built-in HANA function SYSUUID:
a. SELECT SAP_SFND."sap.sop.sopfnd.catalogue::SOPINTEG_BATCH".NEXTVAL AS
BATCH FROM DUMMY;
b. SELECT SYSUUID FROM DUMMY;
2. A batch can contain multiple files. For example, you might want to upload the product
master, customer master, and product-customer master together; in that case you would
have multiple files in one batch. Remember that the file name must be unique within one batch.
3. BATCH PAYLOAD: The actual master data or key figure data. Typically this is inserted
directly into the respective staging tables.
Note: It is assumed that any required ports have already been opened and that user access
to the required S&OP HANA tables has already been granted.
4. Insert data into the staging (STAG) table of the target master data type for your BATCH and FILENAME.
Remember the columns you populated (example: CUSTID, CUSTREGION).
To find the staging table, you can use the following query:
SELECT STAGINGTABNAME FROM
SAP_SFND."sap.sop.sopfnd.catalogue::SOPDM_PLOBJTYPE" WHERE ACTIVE = 'A' AND
PLOBJTYPE = <target master data type>;
For S2CUSTOMER, you must populate the table SAPSOPG.SOPMD_STAG_S2CUSTOMER
with the data and the batch and file from steps (1) and (2).
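For illustration only, a staging insert for S2CUSTOMER could look like the following sketch. The BATCH, FILENAME, CUSTID, and CUSTREGION column names and all sample values are assumptions; check the staging table definition in your system first.

-- Hypothetical example: stage two S2CUSTOMER records using the batch ID and
-- file name created in steps (1) and (2)
INSERT INTO SAPSOPG.SOPMD_STAG_S2CUSTOMER (BATCH, FILENAME, CUSTID, CUSTREGION)
VALUES ('a76d45f19a8511e3a3682a6aa700f01e', 'S2CUSTOMER_20140428', 'CUST001', 'EMEA');
INSERT INTO SAPSOPG.SOPMD_STAG_S2CUSTOMER (BATCH, FILENAME, CUSTID, CUSTREGION)
VALUES ('a76d45f19a8511e3a3682a6aa700f01e', 'S2CUSTOMER_20140428', 'CUST002', 'APJ');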
5. Create batch metadata for your BATCH and FILENAME by calling the following procedure:
CALL SAP_SFND."sap.sop.sopfnd.catalogue::SOPINTEG_DATA_UPLOAD"
(
'a76d45f19a8511e3a3682a6aa700f01e', -- BATCH_OUTSIDE: Batch ID
'SM1PRODUCT_20140428', -- BATCH_NAME_OUTSIDE: Batch name
CURRENT_UTCTIMESTAMP, -- BATCH_TIMESTAMP_OUTSIDE: Batch timestamp in UTC
CURRENT_UTCTIMESTAMP, -- BATCH_UPLOAD_TIMESTAMP_OUTSIDE: UTC timestamp marking the batch start, i.e. when you started inserting data records into the staging area
'INSERT_UPDATE', -- COMMAND_OUTSIDE: INSERT_UPDATE or DELETE (REPLACE is allowed for master data only)
'', -- PLANAREA_OUTSIDE: PLAREA; required only for key figure data, otherwise pass an empty string
'', -- SCENARIO_OUTSIDE: SCNID; required only for key figure data targeting a scenario other than Baseline, otherwise pass an empty string
-1, -- TIMEPROFILE_OUTSIDE: TPID; required only when submitting time period data, otherwise pass -1
'S2CUSTOMER.CUSTID;S2CUSTOMER.CUSTREGION' -- COLUMNLIST_OUTSIDE: semicolon-separated list of columns in files, as FILENAME.COLUMN_NAME (required)
);
6. Once the batch is uploaded with metadata and master data values, the S&OP data
integration job that runs in the background will automatically process the batch. No step
from data integration and no intervention is required here. You can then check the batch
status, where the values mean the following:
ERROR: Fatal error.
PROCESSED_WITH_ERROR: Some records were processed successfully, but there are errors with some other records.
PROCESSED: All records were processed successfully.
You can use this batch status to schedule data flows. For example, data flow 1 might
upload master data S2CUSTOMER, and data flow 2 might upload key figure data that
depends on S2CUSTOMER. You can then trigger data flow 2 based on the batch status of
data flow 1 in Data Services.
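As a sketch, the gate for data flow 2 could be a simple status check against the batch ID that data flow 1 used (the batch ID below is a placeholder):

-- Proceed with data flow 2 only if this returns 'PROCESSED'
SELECT STATUS FROM SAPSOPG.SOPINTEG_BATCH_STATUS
WHERE BATCH = 'a76d45f19a8511e3a3682a6aa700f01e'; -- batch ID of data flow 1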
4. Insert data into the key figure staging table of the target planning area for your
BATCH and FILENAME, supplying a unique ID.
Remember the columns you populated (example: PRDID, CUSTID, KEYFIGUREDATE).
The key figure staging table has one primary key column, ID (NVARCHAR(100)).
Since it is the primary key, it must be populated with a unique value. Either of the
following options can be used to populate the ID column with a unique value:
invoke the SAP-delivered sequence, or populate it with your own unique GUID.
-- Use sequence to populate "ID" column of key figure staging table
SELECT SAP_SFND."sap.sop.sopfnd.catalogue::SOPINTEG_KF_SEQ".NEXTVAL AS ID
FROM DUMMY;
-- Use HANA SYSUUID to populate "ID" column of key figure staging table
SELECT SYSUUID AS ID FROM DUMMY;
To find the staging table, you can use the following query:
Key Figure Data [Baseline]
SELECT STAGINGKFTABNAME FROM
SAP_SFND."sap.sop.sopfnd.catalogue::SOPDM_PLANAREASET" WHERE ACTIVE = 'A'
AND DEFAULT = 'X' AND ISSCENARIO = 0 AND PLAREA = <target planning area>;
For the CONSENSUSDEMAND key figure, populate the table
SAPSOPG.SOPDD_STAGING_KFTAB_S2BASE with the data plus the batch, file, and ID from
steps (1) and (2).
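As an illustrative sketch, staging one CONSENSUSDEMAND record could look like this. The attribute and key figure column names (PRDID, CUSTID, KEYFIGUREDATE, CONSENSUSDEMAND) and the sample values are assumptions, so check the staging table definition for the actual columns:

-- Hypothetical example: stage one key figure record, generating the unique ID
-- from the SAP-delivered sequence
INSERT INTO SAPSOPG.SOPDD_STAGING_KFTAB_S2BASE
(ID, BATCH, FILENAME, PRDID, CUSTID, KEYFIGUREDATE, CONSENSUSDEMAND)
SELECT SAP_SFND."sap.sop.sopfnd.catalogue::SOPINTEG_KF_SEQ".NEXTVAL,
'a76d45f19a8511e3a3682a6aa700f01f', 'CONSENSUSDEMAND_20140428',
'PROD001', 'CUST001', '2014-04-28', 100
FROM DUMMY;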
5. Create batch metadata for your BATCH and FILENAME by calling the same procedure as
in the master data upload; for key figure data, PLANAREA_OUTSIDE must be set to the
target planning area (PLAREA):
CALL SAP_SFND."sap.sop.sopfnd.catalogue::SOPINTEG_DATA_UPLOAD"
(
'a76d45f19a8511e3a3682a6aa700f01f', -- BATCH_OUTSIDE: Batch ID
6. Once the batch is uploaded with metadata and payload, the S&OP data integration job
that runs in the background (a cron job) will automatically process the batch. No step
from data integration and no intervention is required here.
7. Now you can check the status of the batch using the following query:
-- Monitor BATCH processing
SELECT STATUS FROM SAPSOPG.SOPINTEG_BATCH_STATUS WHERE BATCH =
'a76d45f19a8511e3a3682a6aa700f01f'; -- PROCESSED, PROCESSED_WITH_ERROR,
ERROR
The status values mean the following:
ERROR: Fatal error.
PROCESSED_WITH_ERROR: Some records were processed successfully, but there are errors with some other records.
PROCESSED: All records were processed successfully.
You can use this batch status to schedule data flows using a third-party ETL tool.
Step 3:
Provide filter criteria.
Do not specify a filter if you want to extract all data, but be aware that the data volume
can be very large, so use a filter wherever possible.
Notice that the time interval filter is specified relative to current period. This example
extracts key figure data for current and previous time buckets at the respective time level.
Also observe that the time interval filter is specified at the same time level at which the
key figure data is to be extracted.
In this example, key figure data is being extracted at time level PERIODID3, and therefore,
time interval filter is also specified at time level PERIODID3. This is important.
Step 4:
Then call this stored procedure. It returns an SQL statement and a return code.
CALL sap_sfnd."sap.sop.sopfnd.catalogue::SOPAPI_CONSTRUCT_QUERY_SCENARIO"
(
'"SAPSOPG"."SOPHS2HS2"', -- calculation scenario; notice the double quotes. This is the calculation scenario of the scenario from which you want to extract key figure data.
'HS2', -- planning area
-1, -- limit: SQL paging; value -1 means no paging
Step 5:
If the return code was 0 (meaning success), execute the SQL statement returned by step 4,
and you should get the key figure data extract.