
Loading data from Legacy to SAP ECC using SAP BOBJ Data Services via IDOC/BAPI/RFC

Whitepaper
May 2012
Rajnesh Sharma
Parneet Singh Soni
Deloitte Consulting LLP

Introduction
The SAP ECC system does not possess any ETL capabilities: it cannot connect to a remote source and extract data for processing and loading. It therefore has to rely on third-party tools to extract data on its behalf, and Business Objects Data Services (BODS) is the tool of choice because it provides direct connectivity to SAP. Data can be loaded into SAP through various means such as Intermediate Documents (IDOCs), Remote Function Calls (RFCs), and Business Application Programming Interfaces (BAPIs). This capability allows BODS to be used for transferring data into SAP after performing the extract and transform (ET) operations on the data. With the acquisition of Business Objects by SAP, Data Services has become a fast and effective tool for loading data into SAP.

What are IDOCs


An IDOC is a standard data structure for Electronic Data Interchange (EDI) between application programs written for the SAP business system, or between an SAP application and an external program. An IDOC is essentially a data container that serves as a vehicle to transfer data in SAP's Application Link Enabling (ALE) system. IDOCs are asynchronous in nature, i.e., the sender does not receive a status return at the time of processing. To observe the status of an IDOC, a job may have to be run to fetch IDOC statuses back to the user (more on that later).
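Because processing is asynchronous, the posting outcome has to be looked up afterwards from the numeric IDOC status code. A minimal sketch of interpreting these codes (the code values below are standard SAP IDOC statuses; the helper function is our own illustration, not part of any SAP API):

```python
# Interpreting SAP IDOC status codes fetched back from the receiving system.
# The code values are standard SAP IDOC statuses; the helper itself is
# only illustrative.

IDOC_STATUS = {
    "03": "Data passed to port OK (outbound)",
    "51": "Application document not posted (error)",
    "53": "Application document posted (success)",
}

def idoc_posted_ok(status_code: str) -> bool:
    """Return True when the IDOC was posted in the application successfully."""
    return status_code == "53"

for code in ("53", "51"):
    outcome = "OK" if idoc_posted_ok(code) else "FAILED"
    print(code, IDOC_STATUS[code], "->", outcome)
```

A status-fetching job would populate such codes per IDOC; the mapping above is how they are typically read.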

What is BAPI/RFC
BAPI stands for Business Application Programming Interface; it is used to upload data into the SAP system. A BAPI is implemented as a function module, normally RFC-enabled, that acts as a method of a business object. Being remotely enabled, a BAPI can be invoked from remote programs such as standalone Java programs, web interfaces, etc.
An RFC is a call to a function module running in a system different from the caller's. The remote function can also be called from within the same system (as a remote call), but usually the caller and callee are in different systems.
In the SAP system, the ability to call remote functions is provided by the RFC interface. RFC allows remote calls between two SAP systems (R/3 or R/2), or between an SAP system and a non-SAP system.
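The caller/callee relationship can be pictured in plain Python. This is only a conceptual illustration of named-function dispatch between systems, not the real RFC protocol (which in practice is reached through SAP's RFC libraries):

```python
# Conceptual sketch of remote function call dispatch: a "system" registers
# function modules by name, and a caller (local or remote) invokes them by
# name with named parameters, without caring where the code runs.

class System:
    def __init__(self, name):
        self.name = name
        self._functions = {}   # registered "function modules"

    def register(self, fm_name, func):
        self._functions[fm_name] = func

    def call(self, fm_name, **params):
        # The caller only names the function module and passes parameters.
        return self._functions[fm_name](**params)

# An SAP-like callee system exposing one RFC-enabled function module.
ecc = System("ECC")
ecc.register("Z_ADD", lambda a, b: {"RESULT": a + b})

# A call "from another system" looks identical to a local one.
result = ecc.call("Z_ADD", a=2, b=3)
print(result["RESULT"])  # 5
```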

First Step:
SAP Datastore creation using BODS Designer
Open the DS Designer
Click on Create Datastore in the Getting Started section

A Create New Datastore tab will open.


Give the Datastore a name of your choice, e.g., DS_SAP_ABC.

Select SAP Applications from the Datastore Type list. When you select SAP Applications, some new sections will appear.

Click the Advanced button at the bottom left of the Create New Datastore tab, then click the Edit Configuration button. The Configuration options tab will appear. Fill in the information specified below.


In the Connection section:
1. If there is more than one configuration, select which one you want to be the default. To make a configuration the default one, choose Yes.
2. Give the user name to access the SAP application server.
3. Give the password to access the SAP application server.
4. Give the SAP application server name.

In the Locale section:
1. Select EN-English as the language.
2. Select utf-8 as the code page.

In the SAP section:
1. For ABAP execution option, select Generate and Execute.
2. Write the SAP client number <Need SAP team's help for this info>.
3. Write the SAP system number <Need SAP team's help for this info>.
4. For Data Transfer Method, select Direct download.
5. For Working directory on SAP server, select the SAP working directory location.
6. For Local directory, select the SAP local directory location.
7. For Generated ABAP directory, select the SAP local directory location.
8. Keep default values for all the remaining fields.

In the Upload Attributes section:
Keep default values for all the fields. Click OK, and the configuration will be created.

Click the Apply button and then the OK button. The Datastore will be created.


Go to the Datastore tab in the bottom left corner of the Local Object Library to see the newly created Datastore.

Expand the Datastore by clicking the + sign before it.

After expanding, you will see different types of objects that can be used in BODS jobs as sources as well as targets: Extractors, Functions, Hierarchies, IDOCs, and Tables. Of these, we will only work with Functions and IDOCs here.

Second Step:
Importing BAPI, RFC, and IDOCs from the SAP
For importing a BAPI or RFC into Data Services, go to Functions in the DS_SAP_ABC Datastore. Right-click on Functions and then click Import By Name.

A new pop-up will open. Write the name of the RFC or BAPI you want to import and click the Import button.

You can see the imported RFC or BAPI under the Functions section.

For importing IDOCs into Data Services, go to IDOCs in the DS_SAP_ABC Datastore. Right-click on IDOCs and then click Import By Name.

A new pop-up will open. Write the name of the IDOC you want to import and click the Import button.

You can see the imported IDOC under the IDOCs section.

Loading data to SAP through IDOC


IDOCs are SAP's way to exchange messages between systems. For example, you can set up an IDOC to inform all other SAP systems about a new material so that it can be used everywhere immediately. Data Services (DS) is just another remote system that can send or receive IDOCs. Therefore, you can use DS to build an online reporting system by receiving change messages from SAP, keep SAP in sync with other systems, or write batch jobs to send multiple IDOCs.

We will explain the process of loading data into SAP via IDOC with an example. Let us take a sample IDOC used for loading Material master data; the IDOC type is MATMAS_MASS_BAPI01. Below is a snapshot of what a dataflow for a job loading data into SAP via IDOC could look like.
The Target Table is the table your data comes from; it could be a flat file or an Excel file as well. But why is the data split into multiple query transforms, and what is the significance of EDI_DC40? Looking at the IDOC structure will explain matters better.


Above is the structure of the IDOC specific to MATMAS_MASS_BAPI01. The IDOC is structured into various segments, with each segment catering to a specific part of the SAP functionality covered by the IDOC. As per our requirement, these fields need to be mapped. However, we cannot map the fields directly to the IDOC; the IDOC structure first needs to be replicated in a query transform. This can be done simply by connecting the output of a query transform to the IDOC: the entire IDOC structure then replicates itself in the output schema of that query transform.
One segment that is common to all IDOCs is EDI_DC40. It holds the control record for every IDOC.
The control record is stored in table EDIDC. It contains control information relating to the supporting configuration, including, but not confined to: the IDOC number, the direction of transmission, sender and receiver information (port and partner), the message type, the IDOC type, etc. Hence the query transform EDIDC_40 shown earlier. The control record values can be hardcoded in that query transform, and the Row Generator helps generate one row of control records for each IDOC generated, as should be the case.
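The hardcoded values can be pictured as one control-record row per generated IDOC. A sketch of what the EDIDC_40 query transform effectively produces (the field names TABNAM, DIRECT, IDOCTYP, MESTYP, SNDPRN, and SNDPRT come from the real EDI_DC40 control record; the builder function and the sample values are our own illustration):

```python
# One control record per generated IDOC, with values hardcoded the way they
# would be in the EDIDC_40 query transform. Field names follow the standard
# EDI_DC40 control record; build_control_record is an illustrative helper,
# not a BODS or SAP function, and the sample values are placeholders.

def build_control_record(partner_number, partner_type, message_type, idoc_type):
    return {
        "TABNAM": "EDI_DC40",       # control record table name
        "DIRECT": "2",              # direction: 2 = inbound to SAP
        "IDOCTYP": idoc_type,       # basic IDOC type
        "MESTYP": message_type,     # message type
        "SNDPRN": partner_number,   # sender partner number (from the SAP team)
        "SNDPRT": partner_type,     # sender partner type (from the SAP team)
    }

rec = build_control_record("DS_BODS", "LS", "MATMAS", "MATMAS_MASS_BAPI01")
print(rec["MESTYP"], rec["IDOCTYP"])
```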
Before examining the segment structure of the IDOC in detail, there are some values (indicated by the arrow in the picture above) that need to be filled in. You will need the help of the SAP team to fill them in:
Partner Number: <Connect with your SAP team to get this info>
Partner Type: <Connect with your SAP team to get this info>
Message Type: <IDOC name>
Batch Size: <Determines the number of IDOCs sent before taking a timeout for the duration specified in Batch Wait Timeout.>
Application Server: <Same server that was used for creation of the SAP Datastore>
The job may not need all the segments that an IDOC has to offer, so we may map only the required segments and delete the rest in the query transform, as shown.

Understanding the IDOC structure is critical for mapping its fields. You need an understanding of the header-detail structure of the data, if it exists: the header segments can process only one record per IDOC, while the detail record segments can handle multiple records per header.
It is easier to handle IDOC segment mappings if an independent query transform is made to handle each segment.
Tip:

Maintain the keys of the records in every query transform so that data can be joined later at the segment level.

It thus becomes important to define the joins and the From section correctly. They are defined by first making the desired segment current (right-click > Make Current) and then using the From and Where tabs in the query transform, as shown.
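The header-detail constraint (one header record per IDOC, many detail records per header) can be sketched as a grouping step, with the maintained keys used to join detail rows back to their header. This is illustrative code only; in BODS the same result is expressed through joins in the From/Where tabs:

```python
# Group flat legacy rows into one header per IDOC plus its detail rows,
# joined on a shared key (MATNR here) -- the same role the maintained keys
# play when segments are joined in the query transforms. Sample data is
# made up for illustration.

from collections import defaultdict

headers = [
    {"MATNR": "M-001", "DESC": "Bolt"},
    {"MATNR": "M-002", "DESC": "Nut"},
]
details = [
    {"MATNR": "M-001", "PLANT": "1000"},
    {"MATNR": "M-001", "PLANT": "2000"},
    {"MATNR": "M-002", "PLANT": "1000"},
]

detail_by_key = defaultdict(list)
for row in details:
    detail_by_key[row["MATNR"]].append(row)

# One "IDOC" per header record, each carrying all its detail segments.
idocs = [{"header": h, "details": detail_by_key[h["MATNR"]]} for h in headers]
print(len(idocs), len(idocs[0]["details"]))  # 2 IDOCs; the first has 2 detail rows
```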


Map all the required segments from the input schema; the fields that are not required can be mapped as null.
Validate the job and execute it. After successful job execution, you may check the WE05 transaction in SAP to see whether the IDOC was successfully posted, along with the data.

The green dot indicates the IDOC posting was successful. In case of a red one, check the error message in SAP (transaction SLG1 gives a better description). Among the many possible reasons for failure are a parameter not configured in SAP or a mandatory value field in a segment not being passed. Check and correct the mapping accordingly.


Loading data to SAP through BAPI/RFC:


The preliminary steps for loading data into SAP through RFC are the same: set up the Datastore, if not already configured, and import the Function. A dataflow design for loading data through RFC is shown below.

The main point of focus is the query transform Q_BAPI, wherein the BAPI is called. The steps to map a BAPI function inside a query transform are as follows.
Inside a query transform, right-click on the segment and click New Function Call.


The Define Input Parameter box pops up, which lets you map fields in the input schema of the query transform to the BAPI fields.

Once done with the mapping, click Next to choose the output to be returned by the BAPI.


Click Finish.
In subsequent query transforms you may unnest the return table (right-click on the return segment and click Unnest) and then map it to a target table.
The same procedure is followed for mapping an RFC.
During execution, for every record or batch of records that passes through the RFC, a return message is populated in the return table telling whether the record got posted or there was an error.
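These return messages can be evaluated record by record. A sketch of checking a BAPI-style return table (the TYPE and MESSAGE fields follow the standard BAPIRET2 structure; the sample rows and the helper function are our own illustration):

```python
# Classify BAPI return messages row by row. In the standard BAPIRET2
# structure, TYPE is 'S' (success), 'W' (warning), 'E' (error), or
# 'A' (abort), and MESSAGE holds the readable text. The sample rows
# below are made up for illustration.

return_table = [
    {"TYPE": "S", "MESSAGE": "Material M-001 created"},
    {"TYPE": "E", "MESSAGE": "Mandatory field PLANT missing"},
]

def failed_rows(rows):
    """Rows whose posting ended in error or abort."""
    return [r for r in rows if r["TYPE"] in ("E", "A")]

errors = failed_rows(return_table)
print(len(errors), "failed:", errors[0]["MESSAGE"])
```

In the dataflow, the same check would typically be done by filtering the unnested return table in a downstream query transform.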

Conclusion
This document will be helpful for people working on the conversion of legacy data into SAP. It gives a detailed description of how to create the SAP Datastore in BODS, which acts as a connector between BODS and SAP. With the help of the Datastore we can import IDOCs, BAPIs, RFCs, and Tables from SAP into the BODS Designer and use them in BODS jobs to implement the required logic.
This document also explains, with the help of examples, the steps used for connecting the IDOCs and BAPIs with the legacy data and loading it into SAP.


Rajnesh Sharma is a Consultant with Deloitte Consulting LLP, based in the Mumbai office. He has five years of experience in Data Warehousing and Business Intelligence projects for different clients. He can be reached at rajnsharma@deloitte.com.
Parneet Singh Soni is a BTA with Deloitte Consulting LLP, based in the Mumbai office. He has nearly a year of experience in Data Warehousing. He can be reached at pasoni@deloitte.com.

This publication contains general information only and is based on the experiences and research of Deloitte practitioners. Deloitte is not,
by means of this publication, rendering business, financial, investment, or other professional advice or services. This publication is not a
substitute for such professional advice or services, nor should it be used as a basis for any decision or action that may affect your
business. Before making any decision or taking any action that may affect your business, you should consult a qualified professional
advisor. Deloitte, its affiliates, and related entities shall not be responsible for any loss sustained by any person who relies on this
publication.
As used in this document, Deloitte means Deloitte Consulting LLP, a subsidiary of Deloitte LLP. Please see www.deloitte.com/us/about
for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services may not be available to attest clients
under the rules and regulations of public accounting.
Copyright 2012 Deloitte Development LLC. All rights reserved.
Member of Deloitte Touche Tohmatsu Limited
