
1

2
3
4
Before going live, it is vital that tests are performed on productive data.
It is not sufficient to perform the tests on a copy of the productive source system – for the
following reason:
The online replication (in contrast to the Initial Load logic) works on raw posting information that
is captured and temporarily stored at the point in time when the document is posted. As long as
the Central Finance logic is not active in the productive source system, the raw posting
information does not get captured and is also not included in a copy of the system.
The following steps are recommended:
Work with a TEST source system and a Central Finance test system
1) Perform the initial load from a copy of the productive source system
2) Test the online replication from a copy of the productive source system. In your tests, carry out various processes that lead to financial and controlling documents.
Work with a PRODUCTIVE source system and a Central Finance test system
3) Perform the initial load from the productive source system
4) Test the online replication from the productive source system. Live data is replicated to the Central
Finance system. The tests should run for at least a complete financial period so that all the typical kinds of
postings are part of the test.
GO LIVE: Work with a PRODUCTIVE source system and a PRODUCTIVE Central Finance system
5) Perform the initial load from the productive source system.
6) Switch on online replication from the source system.
You should also note that the initial load works completely differently from the online replication, as
existing data has to be restructured. In a proof-of-concept, the initial load should not be used to
demonstrate how well the online replication will work for the customer scenarios.

5
6
7
SAP ERP 6.0 systems and higher can be connected out of the box.
Several SAP Notes need to be applied in the source system to enable the systems to be connected with Central Finance.
These notes introduce new database tables but do not change any business processes.
(Further information can be found in the EKT material for technical consultants.)

8
The replication of FI documents needs to be activated per company code.

To have starting balances in the Central Finance system, an initial load needs to be performed. The initial load is configured in the same Customizing view.
The initial load has two granularities:
- Takeover of balances: for historic data where only a lower granularity (as in the FI totals tables GLT0 or FAGLFLEXT) is required
- Takeover of individual FI documents: for a higher granularity, each FI posting is reposted

In this configuration, you can define (see the sketch after this list):
- from which year balances shall be taken over
- from which period/year individual documents shall be taken over
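
A minimal Python sketch of this cutover logic, with hypothetical names and parameters (the actual behavior is determined by the Customizing settings described above): everything posted before the configured document start period is reflected only via the balance takeover, while later postings are transferred as individual documents.

def initial_load_category(fiscal_year, period,
                          balances_from_year, documents_from_year, documents_from_period):
    # Hypothetical decision rule for the two granularities of the initial load.
    if (fiscal_year, period) >= (documents_from_year, documents_from_period):
        return "individual FI document"   # reposted one by one
    if fiscal_year >= balances_from_year:
        return "balance takeover"         # only period balances are transferred
    return "not transferred"

# Example: balances from 2022 onwards, individual documents from period 1/2024 onwards
print(initial_load_category(2023, 7, 2022, 2024, 1))   # balance takeover
print(initial_load_category(2024, 3, 2022, 2024, 1))   # individual FI document
print(initial_load_category(2020, 5, 2022, 2024, 1))   # not transferred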

GL Reconciliation Postings Transferred:
In this field, you define whether GL reconciliation postings triggered in CO should be replicated to the Central Finance system during the initial load.
You should only set this flag if secondary costs are not transferred during the initial load.

For further information, please refer to the chapter “Initial Load”.

If the initial load has been performed successfully (confirmed by the flag “Initial Load Finished”), an extraction of data for the Central Finance system is no longer possible.

9
10
11
Enabled by SAP Notes

2318183 - Central Finance: Wrong number of decimals for amounts in replicated documents
2325587 - Central Finance: Wrong number of decimals for amounts in replicated CO
documents

12
Define Decimal Places for Currencies in Source Systems
Use
In this activity you set the number of decimal places for currencies of the source system, if they are
defined differently than in the Central Finance system.
Requirements
The number of decimal places for currencies in both the source and Central Finance systems has been
maintained in the IMG activity Set Decimal Places for Currencies. This activity is usually part of the
general setup of the system.
You have compared the number of decimal places in all currencies in use in your systems and identified
any currencies with differing numbers of decimals.
Standard settings
In the standard setup, this activity contains no settings. This means that all currencies are assumed to have the same number of decimal places in the source and Central Finance systems.
Activities
For any currencies with differing numbers of decimal places, enter the number of decimal places as
defined in the source system. For currencies that have the same number of decimals in the source and
Central Finance systems, you do not need to make any entries.
Example
In a source system (logical system Q7QCLNT002), currency KWD (Kuwaiti Dinar) is set to 2 decimal places, while in the Central Finance system this currency is set to no decimal places.
Make the following entry:

Logical System    Currency    Decimals
Q7QCLNT002        KWD         2

Note: Even though the definition of the logical system includes the client of the source system, the corresponding setting is client-independent in the source system. This means that the settings must be the same for all clients of the same source system.

13
If the currency in the Central Finance system has fewer decimals than in the sender system, rounding differences can occur that have to be handled and distributed to other document items.

“Round half up” is applied if required:
- If the last digit is ≤ 4, the amount is rounded down.
- If the last digit is ≥ 5, the amount is rounded up.

How is this rounding difference distributed?

The rounding difference is added to or subtracted from the first non-automatically generated G/L account, material or asset account item, depending on the +/- sign in the document. The amount of a vendor or customer line item is only changed if special prerequisites are met (for example, the document contains no G/L line items or only lines with small amounts), because the preceding application is supposed to display the same amounts for business partners as the ones posted in Financial Accounting.
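
The following Python sketch illustrates this logic under simplified assumptions; the line categories, field names, and the selection of the adjusted line are illustrative only and not the actual implementation of the Central Finance accounting interface. It reuses the KWD example from the previous page (2 decimals in the source system, 0 decimals in the Central Finance system).

from decimal import Decimal, ROUND_HALF_UP

def round_half_up(amount, target_decimals):
    # "Round half up": a last digit <= 4 rounds down, >= 5 rounds up.
    quantum = Decimal(1).scaleb(-target_decimals)   # e.g. 0 decimals -> 1, 2 decimals -> 0.01
    return Decimal(str(amount)).quantize(quantum, rounding=ROUND_HALF_UP)

def repost_lines(lines, target_decimals):
    # Round every line amount, then move the resulting difference to the first
    # manually entered G/L, material or asset line so the document stays balanced.
    # The special prerequisites for adjusting vendor/customer lines are omitted here.
    rounded = [dict(line, amount=round_half_up(line["amount"], target_decimals))
               for line in lines]
    difference = sum(line["amount"] for line in rounded)   # signed sum; 0 = balanced
    if difference:
        first_gl = next(line for line in rounded
                        if line["kind"] in ("GL", "MATERIAL", "ASSET")
                        and not line["auto_generated"])
        first_gl["amount"] -= difference
    return rounded

# Example: a KWD document with 2 decimals in the source, 0 decimals in Central Finance
document = [
    {"kind": "VENDOR", "auto_generated": False, "amount": Decimal("-105.50")},
    {"kind": "GL",     "auto_generated": False, "amount": Decimal("100.25")},
    {"kind": "TAX",    "auto_generated": True,  "amount": Decimal("5.25")},
]
print(repost_lines(document, target_decimals=0))
# -105.50 -> -106, 100.25 -> 100, 5.25 -> 5 leaves a difference of -1,
# so the G/L line is adjusted from 100 to 101 and the document balances again.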

14
15
The present slide shows the typical system landscape of a Central Finance customer. On the
left side, you can see one or more source systems. These are connected via SLT to the Central
Finance system.
FI and CO documents are transferred in real time to the Central Finance system via interfaces.
Once in the Central Finance system, the FI and CO documents go through a business mapping
and then the accounting interface of the Central Finance system, where they are posted. In the
Central Finance system you can find all FI and CO documents in the ACDOCA table.

Customers can transfer data to the Central Finance system without mapping it. However, if they want to use different or harmonized data in the Central Finance system, they might have to map all their master data (customers, vendors, G/L accounts, cost centers, etc.) and Customizing data (payment terms, tax codes, etc.). This can be very time-consuming.
The tool for mass handling of mappings in Central Finance, which is described in this presentation, has been developed to make this activity easier and faster for SAP customers.

16
The Central Finance Mapping Tool allows mass handling of mappings in Central Finance.
The tool offers the following functions:

• Generation of Excel (CSV) templates per mapping entity (S/4HANA 1610 FPS0 key mapping only)
• Mass upload of mapping values from Excel (CSV) templates (S/4HANA 1610 FPS0 key mapping only)
• Mass download of mapping values for checking purposes
• Display of existing mapping values
• Mass deletion of mapping values (S/4HANA 1610 FPS0 key mapping only)

The tool performs several checks when uploading (and deleting) mapping entries, in test run as well as in update run mode (illustrated by the sketch after this list):

• For example, in the CSV file it checks the format, looks for duplicate entries, and checks whether there are 1:n mappings.
• During the upload, the mapping tool also checks whether the new mapping values, together with the ones already in the system, would result in duplicates or 1:n mappings.
• As an additional service, it also checks whether the mapping values already in the Central Finance system contain 1:n mappings.
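
A minimal Python sketch of these duplicate and 1:n checks, under assumptions: the file layout (two semicolon-separated columns with the source value and the target value) and all names are hypothetical, and the real checks are part of the mapping tool and of MDG key mapping.

import csv
from collections import defaultdict

def check_mapping_csv(path, existing=()):
    # Collect (source value, target value) pairs from the CSV file and report findings.
    pairs = list(existing)          # optionally include mappings already in the system
    findings = []
    with open(path, newline="", encoding="utf-8") as csv_file:
        for row_no, row in enumerate(csv.reader(csv_file, delimiter=";"), start=1):
            if len(row) != 2:
                findings.append(f"row {row_no}: unexpected format {row}")
                continue
            source, target = (value.strip() for value in row)
            if (source, target) in pairs:
                findings.append(f"row {row_no}: duplicate entry {source} -> {target}")
            pairs.append((source, target))
    # 1:n check: one source value must not be mapped to several target values.
    targets_by_source = defaultdict(set)
    for source, target in pairs:
        targets_by_source[source].add(target)
    for source, targets in targets_by_source.items():
        if len(targets) > 1:
            findings.append(f"1:n mapping for source value {source}: {sorted(targets)}")
    return findings

# Example: check a file against mappings that are already in the system
print(check_mapping_csv("cost_center_mappings.csv", existing=[("CC1000", "CC9000")]))

In an update run, the real tool would additionally write the accepted entries to the MDG mapping tables; the sketch only reports the findings.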

17
Because the CSV file used for upload or deletion of mappings must have a specific format, we
offer the function “Generate Template” to create an empty CSV file, which can be used for
preparing the mapping values.

You select a Mapping Entity, the function “Generate Template”, a file name for the CSV file, and
then execute the report.

The CSV template contains one column per mapping entity and context ID in the source and target system.

18
Once you have prepared the CSV file with the mapping values you want to upload, you can use
it for uploading mappings into the Central Finance system.
You select a Mapping Entity, a Source System, the function “Upload Mappings”, a CSV file, and
then execute the report.
You can execute the report in test run mode first. After the report has run, the application log is
displayed automatically and shows information, warning and error messages.
In test run mode, no update of the MDG tables takes place, but all checks that are executed in the update run are also executed.

19
Once you are sure that the mapping values in the CSV file are correct, you execute the report in
update mode.
You select a Mapping Entity, a Source System, the function “Upload Mappings”, a CSV file, and
then execute the report.
After the report has run, the application log is displayed automatically and shows information,
warning and error messages.
If the CSV file contains both correct and incorrect mapping values, the correct ones are uploaded to the Central Finance system, while the incorrect ones are ignored. This is the standard MDG logic.

20
You can use the “Display Mappings” functionality to display a list of mappings for a specific mapping entity and, optionally, a source system. If you don't select any source system, all mapping values for the mapping entity across all systems are displayed.
You select a Mapping Entity, a Source System (optional), the function “Display Mappings”, and
then execute the report.
In the next screen, you can use the usual ALV functions for searching, sorting, filtering and
downloading mapping values.

21
You can use the “Download Mappings” functionality to download a list of mappings for a specific mapping entity and, optionally, a source system. If you don't select any source system, all mapping values for the mapping entity across all systems are downloaded.
You select a Mapping Entity, a Source System (optional), the function “Download Mappings”, a
file name for the CSV file, and then execute the report.

22
Should you realize that some mapping values are wrong, you can use the mapping tool to
delete them from the MDG tables.
You select a Mapping Entity, a Source System, the function “Delete Mappings”, a CSV file
(optional), and then execute the report.
• You can delete several selected mapping values at once: in this case, you need to prepare a CSV file containing the values to be deleted and select it before executing the report.
• If you would like to delete all mapping values for a certain mapping entity, you do not select any file.
After the report has run, the application log is displayed automatically and shows information,
warning and error messages.

23
If SAP MDG is not used by the customer, the mapping needs to be maintained manually in a
user interface.

There are two kinds of mapping entities:

- Keys: these are master data that are typically created every day and distributed throughout the system landscape, such as customers, materials, vendors, and G/L accounts.
- Values: these are more like Customizing data and are more stable. They are often used within master data that is distributed (e.g. a customer is assigned to a certain dunning area, payment terms, etc.).

24
25
26
27
28
29
For FI documents, the document header table and its sub-tables (BKPF, BSEG, etc.) are not sufficient, because some other information needed for document posting is no longer available once the document has been posted. However, to properly post the document in the Central Finance system, this information is required.

That’s why the information needed for document posting is stored in separate tables (with CFIN_ACCHD being the header table). This information is replicated via SAP LT Replication Server to the Central Finance system.
A cleanup program, RFIN_CFIN_CLEANUP, which needs to be scheduled regularly (for example, once each month), can delete the temporary information from these tables.

In the configuration, you can define for how many periods a temporarily stored data record is kept (e.g. to be able to correct erroneous postings) before it is deleted by the cleanup program.
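
A minimal sketch of such a retention rule, with hypothetical names and structures; the actual selection logic of RFIN_CFIN_CLEANUP is not documented here.

from dataclasses import dataclass

@dataclass
class TempRecord:
    document: str
    fiscal_year: int
    period: int        # 1..12; special periods are ignored in this sketch

def is_expired(record, current_year, current_period, keep_periods, periods_per_year=12):
    # A record is deleted once it is older than the configured number of periods,
    # counting across fiscal year boundaries.
    age = (current_year - record.fiscal_year) * periods_per_year \
          + (current_period - record.period)
    return age > keep_periods

# Example: with a retention of 3 periods, a record from period 11/2023 is still
# kept in period 2/2024 (age 3) but would be deleted in period 3/2024 (age 4).
print(is_expired(TempRecord("0100000001", 2023, 11), 2024, 2, keep_periods=3))   # False
print(is_expired(TempRecord("0100000001", 2023, 11), 2024, 3, keep_periods=3))   # True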

The new tables (such as CFIN_ACCHD) come with certain support packages or with SAP Note 2111634 for systems that are still in maintenance.
For older SAP ERP releases and non-SAP systems, the integration must be done in the customer project.

30
Initial Load - Transfer FI Documents to Central Finance System
This step of the initial load is a prerequisite for productive use. It transfers the FI documents into intermediate database tables in the Central Finance system. This is a prerequisite for the second step.
It also populates the characteristics database table of the profitability analysis in the source
system.

The replication of FI documents needs to be activated per company code.

To have starting balances in the Central Finance system, an initial load needs to be performed. The initial load is configured in the same Customizing view.
The initial load has two granularities:
- Takeover of balances: for historic data where only a lower granularity (as in the FI totals tables GLT0 or FAGLFLEXT) is required
- Takeover of individual FI documents: for a higher granularity, each FI posting is reposted

In this configuration, you can define:
- from which year balances shall be taken over
- from which period/year individual documents shall be taken over

31
32
In this section you define one migration clearing account for each company code for which
postings are to be loaded into the Central Finance system.
For each reconciliation account you must define one substitution account.
In the first step of the initial load, all balances related to reconciliation accounts are transferred
to their assigned substitution accounts. In a second step, open items are posted to the
reconciliation accounts, while the offsetting entries are posted to the substitution account.
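
A minimal sketch, with hypothetical figures and account names, of how these two steps keep the accounts balanced; the offsetting account for the balance transfer is assumed here to be the migration clearing account.

from collections import defaultdict

# Hypothetical figures: a customer reconciliation account with a historic balance
# of 1000, of which 800 are still open items at the time of the initial load.
postings = [
    # Step 1: the balance is transferred to the substitution account
    # (offsetting entry assumed to go to the migration clearing account).
    {"account": "SUBSTITUTION_ACCOUNT",   "amount": 1000},
    {"account": "MIGRATION_CLEARING",     "amount": -1000},
    # Step 2: open items are posted to the reconciliation account,
    # offsetting entries are posted to the substitution account.
    {"account": "RECONCILIATION_ACCOUNT", "amount": 800},
    {"account": "SUBSTITUTION_ACCOUNT",   "amount": -800},
]

balances = defaultdict(int)
for posting in postings:
    balances[posting["account"]] += posting["amount"]

print(dict(balances))
# {'SUBSTITUTION_ACCOUNT': 200, 'MIGRATION_CLEARING': -1000, 'RECONCILIATION_ACCOUNT': 800}
# The reconciliation account ends up carrying the open items, while the already
# cleared part of the historic balance (200) remains on the substitution account.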

33
34
35
Define Initial Load Groups
Use
In this Customizing activity you define initial load groups. You use initial load groups to separate
the initial load in Central Finance into groups to which you have assigned company codes.
Requirements
You can carry out the initial load in the following ways:
- To execute the initial load for all maintained source systems and company codes at the same time, you do not need to use initial load groups.
- To execute the initial load for a group of company codes, you must first define initial load groups.
It is not possible to mix these methods. If you execute an initial load using one method and afterwards decide that you want to use the alternative method, you must first delete the initial load data.
For more information, see the activity Delete Initial Load Data.
Activities
1. Maintain the ID and description of the initial load group.
2. Assign combinations of logical systems and company codes to the initial load group that you
have created.
Note that a combination of logical system and company code can only be assigned to one initial
load group at a time.
You can now trigger a simulation of the initial load or an initial load data extraction for one initial
load group.
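
A minimal sketch, with illustrative names only, of the constraint that a logical system/company code combination can be assigned to only one initial load group at a time:

initial_load_groups: dict[tuple[str, str], str] = {}

def assign_to_group(group_id: str, logical_system: str, company_code: str) -> None:
    # Reject the assignment if the combination already belongs to another group.
    key = (logical_system, company_code)
    if key in initial_load_groups and initial_load_groups[key] != group_id:
        raise ValueError(f"{logical_system}/{company_code} is already assigned to "
                         f"group {initial_load_groups[key]}")
    initial_load_groups[key] = group_id

assign_to_group("GROUP_EU", "Q7QCLNT002", "1000")
try:
    assign_to_group("GROUP_US", "Q7QCLNT002", "1000")
except ValueError as error:
    print(error)   # Q7QCLNT002/1000 is already assigned to group GROUP_EU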

36
With the upfront definition of Initial Load Groups, the extraction and execution of the initial load can be performed separately. This enables teams focusing on different companies to work in parallel on the initial setup of Central Finance.

All relevant steps of the Initial Load can be done per Initial Load Group:

• Execute Initial Load
  - Mapping Simulation
  - Posting Simulation
  - Posting
• Monitor Initial Load Execution
  - Monitor Simulation of Mapping
  - Monitor Simulation of Posting
  - Monitor Posting
• Delete Initial Load Data

The extraction step cannot be executed per Initial Load Group. Instead, the extraction is executed against all maintained logical source systems and those company codes for which the “Configuration Settings in Source System” have been made.

The activity “Compare Actual and Expected CO Postings in Central Finance” also does not use the Initial Load Group concept, because there it is possible to restrict the selection by logical system of the sender and by company code.

37
Simulate Mapping
This step of the initial load is optional. It can be used to find problems in mapping before the actual
posting is carried out.
The output of the report is a list of messages, which is stored in the application log.
Dependencies
The Initial Load step Extract Data should be finished before you start this simulation step. However, it is not mandatory that all packages have been extracted successfully.
Features
This step helps to identify mapping errors before you execute the posting step of the initial load. The
simulate mapping run is much quicker than the simulate posting run.
You can display the status of this migration step with a dedicated monitoring function.
Simulate Posting
This step of the initial load is optional. It should help to find missing Customizing and master data before the actual posting is performed. Most of the checks that are executed during the posting run are also executed in the simulate posting run.
The output of the report is a list of messages, which is stored in the application log.
Dependencies
The Initial Load step Extract Data should be finished before you start this simulation step. However, it
is not mandatory that all packages are extracted successfully.
Features
This migration step is optional for finishing the Initial Load. It helps to find problems before the actual posting run.
You can display the status of this migration step with a dedicated monitoring function.

38
39
Some management accounting (CO) postings require additional data for document replication. These data must be passed to the Central Finance system together with the source document. The Customizing activity ‘Prepare for and Monitor the Initial Load of Management Accounting Postings’ is used for this purpose.
These data include:
• CO-PA line items and characteristics, which are converted into a key-value pair structure and stored in the CFIN_COPA table (see the sketch after this list)
• References to the original documents, stored in the CFIN_CO_ADD table for reposting documents using the RKU3 business transaction
• Additional attributes, stored in the CFIN_CO_ADD table to generate the CO key subnumber (HRKFT) field in the COEP table
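
A minimal sketch of such a key-value conversion; the flat layout and the function below are illustrative assumptions, not the actual structure of the CFIN_COPA table.

def to_key_value_rows(document_number: str, item_number: int, characteristics: dict) -> list[dict]:
    # Flatten a CO-PA line item into one row per characteristic/value pair.
    return [
        {"document": document_number,
         "item": item_number,
         "characteristic": name,
         "value": value}
        for name, value in characteristics.items()
    ]

rows = to_key_value_rows(
    "1000000123", 1,
    {"KNDNR": "CUST01", "ARTNR": "MAT42", "VKORG": "1000"},   # customer, product, sales org
)
for row in rows:
    print(row)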
Once all configuration and preparation activities are finished, a smoke test can be performed for cost object mapping and CO document replication. It should be helpful for finding incorrect configurations.
Note: This tool should be distinguished from the activity Simulation for Initial Load, which is used for the simulation of initial load data with a high data volume. This activity is not intended for the initial load and handles a relatively small number of records selected by the user for smoke test purposes.
The simulation of the initial load should be performed as well. It can help to identify incorrect configurations in the Central Finance system without generating real data.
The simulation of the initial load for Management Accounting documents and the cost object mapping are both triggered in the SLT system. Simulation results can be checked in the AIF of the Central Finance system. More details can be found in the documentation of the corresponding Customizing activities.

40
41
With this tool, you can simulate cost object replication and mapping by executing the necessary
checks, without actually creating the cost object mapping or posting the CO document. This
tool should be used in the early phase, once you have finished the configuration in the central system. It can help you to find missing Customizing and master data before the actual transaction posting takes place.

42
43
44
45
46
Correct Cost Object Mapping
Sometimes, after cost objects have been replicated from the source system to the target system, you might want to change the mapping rules for Cost Object Mapping Scenarios.
After such a change, the already replicated cost objects need to be replicated and mapped again according to the new mapping rules.

Delete Cost Object Mapping and Cost Objects


Sometimes, after the replication of cost objects from the source system to the target system, you need to clean up all replicated cost objects and also the mappings. In this Customizing activity, you can clean up and delete replicated cost objects and mappings. Then you can perform the replication again. This activity only deletes the cost objects; it does not delete the master data and transactional data that refer to the cost objects.

47
The summary allows the learner to recap what they learned during the lesson.
It should be more than a listing of the objectives of the lesson. It should be result oriented and
review the main learning points of the lesson.

48
49
