
Data Migration Strategy

Birmingham City Council


Customer First Transformation Programme

Version 1.0

14th January 2008

Author

Date:

Approver

Date:

EXAMPLE: Data Migration Strategy (Customer First)


Version 1.0 Date 24/03/2010

Table of Contents
1 Document Information.............................................................................................4
2 Introduction..............................................................................................................5
3 Definitions of Different Types of Data......................................................................6
3.1 Master Data...........................................................................................................6
3.2 Configuration Data................................................................................................6
3.3 Transaction Data...................................................................................................6
4 Data Migration Activities..........................................................................................7
4.1 Define the SAP Data Requirements (Functional)..................................................7
4.2 Define the SAP Data Requirements (Technical)...................................................7
4.3 Identify the Legacy Data (Functional)...................................................................7
4.4 Identify the Legacy Data (Technical).....................................................................7
4.5 Define Data Standards..........................................................................................7
4.6 Legacy Data Cleansing.........................................................................................7
4.7 Determine the Data Transfer Method....................................................................7
4.8 Data Mapping and Transformation........................................................................8
4.9 Identify Missing Data.............................................................................................8
4.10 Resolve Missing Data..........................................................................................8
4.11 Extract Legacy Data............................................................................................8
4.12 Design Automatic Loads......................................................................................8
4.13 Develop Automatic Loads....................................................................................8
4.14 Manual Data Entry...............................................................................................9
4.15 Data Loading Instructions....................................................................................9
4.16 Trial Data Upload - Running................................................................................9
4.17 Trial Data Upload - Checking...............................................................................9
4.18 Execution Plan for Final Uploads.........................................................................9
4.19 Dual Data Maintenance.......................................................................................9
4.20 Data Cutover........................................................................................................9
4.21 Reconcile the Migrated Data................................................................................9
4.22 Data Sign-off........................................................................................................9
5 Data Migration Guiding Principles.........................................................................10
5.1 Data Migration Approach.....................................................................................10
5.1.1 Master Data - (e.g. Customers, Assets)...........................................................10
5.1.2 Open Transactional data (e.g. Service Tickets)................................................10
5.1.2.1 Historical Master and Transactional data......................................................10
5.2 Data Migration Testing Cycles.............................................................................11
5.3 Data Cleansing....................................................................................................11
6 Data Migration Methods........................................................................................13
6.1 Legacy System Migration Workbench (LSMW)...................................................13
6.2 LSMW process flow.............................................................................................13
6.3 Bespoke Program Development..........................................................................14

Data Migration Strategy250297826.doc


06/11/2014

page 2 of 19

6.4 Manual Migration.................................................................................................14
7 Data Object Structure (DOS).................................................................................15
8 Data Object Register.............................................................................................16
9 Roles and Responsibilities....................................................................................17
10 Key Issues & Risks..............................................................................................18
10.1 Current Data Migration Issues...........................................................................18
10.2 General Data Migration Risks............................................................................18


1 Document Information

Area: Data
Title: Customer First Data Migration Strategy
Business Process Owner:
Short description:
SolMan Dev ID:
Reviewed by:
Approved by:
Document Status: Ready for review
Comments:

Version | Date | Summary of Changes | Author
1.0 | 14.01.2008 | Document ready for review | Mushtaq Khan / Graeme Cox


2 Introduction

The scope of this document is to define the data migration strategy for Customer First from a CRM perspective. By its very nature CRM is not a wholesale replacement of legacy systems with SAP, but rather the coordination and management of customer interaction within the existing application landscape. A large-scale data migration in the traditional sense is therefore not required; only a select few data entities will need to be migrated into CRM.
Data migration is typically a one-off activity prior to go-live. Any ongoing data loads required on a frequent or ad-hoc basis are considered to be interfaces, and are not part of the data migration scope.
This document outlines how the CF project intends to manage the data migration from the various council legacy systems to the SAP CRM system.
The creation of the Single Customer Record is not included in this document; refer to the Single
Customer Record MDM design paper for information on this subject.


3 Definitions of Different Types of Data

In the context of this document legacy applications are defined as those that will be replaced by the
Customer First Transformation programme, back-office applications are defined as applications that will
not be replaced but may be integrated into the Customer First applications.

3.1 Master Data

Master data is relatively fixed data describing the people, places and objects involved in running the business processes. These records tend to be created once, maintained over a long time frame, and used by a number of business activities. Examples include customers, assets, and land & property.

3.2 Configuration Data

This is data that is set up in SAP during the build and configuration process. This type of data is not part of the migration process, as it will be transported to the production system through the transport procedure along with all other configuration.

3.3 Transaction Data

Transaction data describes business activities conducted by the council in carrying out its duties. In the context of Customer First these transactions will be related to the customer interaction process, for example:
- Customer providing new information, such as new address details.
- Customer seeking information, such as the location of the nearest swimming pool.
- Customer making an application, such as an application for a parking permit.
- Customer making a booking, such as an appointment with a council officer.
- Customer raising a service request, such as a bulky waste collection.
- Customer making a payment, such as a traffic offence fine.

Transactional data falls into two categories:
- Open Transactional Data is transactional data that has not completed its business cycle, for example a service ticket that remains open with additional activities required prior to being closed.
- Closed Transactional Data is transactional data that has completed its business cycle and is subsequently used for information purposes only, for example a service ticket with all related activities completed and a ticket status of closed.


4 Data Migration Activities

The complexity of data migration demands a process that ensures the accurate and complete transfer
of data into the new system from legacy systems. The activities involved in the data migration process
are detailed below.

4.1 Define the SAP Data Requirements (Functional)

In SAP a Data Object is a business data unit such as customers; the functional SAP data requirements
define the SAP transactions, the screen headings and the screen field descriptions used to enter the
data.

4.2 Define the SAP Data Requirements (Technical)

These define in detail the SAP data structures, tables and fields, including the field name, field attributes (e.g. data type and length) and field properties (mandatory, optional, conditional or suppressed). This process requires a detailed knowledge of the associated SAP business processes and careful analysis of the configured SAP system.

4.3 Identify the Legacy Data (Functional)

This activity identifies where the legacy data currently resides, in which applications/databases, and
how it is currently entered and maintained. In addition this activity defines the legacy screen headings
and the screen field descriptions used to enter the data.

4.4 Identify the Legacy Data (Technical)

This activity identifies in detail the legacy data tables and fields, the field attributes and properties
including the data type and length. This process requires a detailed knowledge of the associated
legacy applications.

4.5 Define Data Standards

The data standards define the required appearance, consistency and content of the data. For example,
the name and address formats, uppercase & lowercase requirements, providing a consistent look for
the data that will be visible both internally and externally to the business.

4.6 Legacy Data Cleansing

Legacy data required for migration into SAP must be fully cleansed prior to the final data load, to ensure the consistency and accuracy of the data.
As a general approach the data will be cleansed on the legacy database before extraction; however, there may be circumstances where this is not the best method and the data will be cleansed after it has been extracted.
Cleansing is an iterative process that can start as soon as the data has been identified as being required for migration into SAP. The data cleansing cycle includes the following steps:
- The elimination of obsolete records.
- The removal of duplicate records.
- Correcting inaccurate records.
- Correcting incomplete records.
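The cleansing cycle above can be sketched in code. This is a minimal illustration only; the record layout and field names ("customer_id", "status", "postcode") are hypothetical, and real cleansing would run against the legacy database or staging area rather than in-memory lists.

```python
# Illustrative cleansing pass over extracted legacy records.
# Field names are hypothetical examples, not taken from any BCC system.

def cleanse(records):
    """Apply the cleansing steps to a list of record dicts."""
    # 1. Eliminate obsolete records (e.g. those flagged as deleted).
    live = [r for r in records if r.get("status") != "DELETED"]

    # 2. Remove duplicates, keeping the first occurrence of each key.
    seen, unique = set(), []
    for r in live:
        key = r["customer_id"]
        if key not in seen:
            seen.add(key)
            unique.append(r)

    # 3./4. Correct inaccurate or incomplete values where a rule exists;
    # here, strip stray whitespace and normalise postcode case.
    for r in unique:
        r["postcode"] = r.get("postcode", "").strip().upper()
    return unique

records = [
    {"customer_id": "C1", "status": "LIVE", "postcode": " b1 1bb "},
    {"customer_id": "C1", "status": "LIVE", "postcode": "B1 1BB"},
    {"customer_id": "C2", "status": "DELETED", "postcode": "B2 2CC"},
]
print(cleanse(records))
# one record remains: C1 with postcode "B1 1BB"
```

In practice each step would be iterated with the business until the data is fit for load.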

4.7 Determine the Data Transfer Method

For each business object the following choices are available for data transfer:
- Use the standard SAP data transfer programs.
- Manually enter data with online transactions.
- Develop bespoke batch input programs.


Data volumes, data complexity and availability of standard SAP data load programs are all to be
considered before deciding between automated or manual load.

4.8 Data Mapping and Transformation

This is a manual process in which the data fields in the legacy data source are assigned to corresponding fields in the SAP system. Field text in the legacy system rarely agrees with the corresponding terminology in the SAP system, so a variety of mapping methods are required. At the end of this step every SAP field that requires data must have been either:
- Assigned a corresponding field from the legacy system.
- Assigned a transformation, i.e. converted from the original state to the required state using a variety of methods including lookup tables, combined fields or logical rules.
- Assigned a constant value.
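The three mapping outcomes can be sketched as a field-mapping table. All legacy and SAP field names below are hypothetical, chosen only to show the shape of a direct assignment, a lookup-table transformation, a combined-field transformation and a constant.

```python
# A minimal sketch of a field-mapping table. Each SAP target field is
# mapped straight from a legacy field, derived by a transformation,
# or given a constant value - the three outcomes described above.

TITLE_LOOKUP = {"1": "Mr", "2": "Mrs", "3": "Ms"}  # lookup-table transform

MAPPING = {
    "NAME_LAST": lambda rec: rec["surname"],                         # direct assignment
    "TITLE":     lambda rec: TITLE_LOOKUP.get(rec["title_cd"], ""),  # lookup table
    "FULL_NAME": lambda rec: f'{rec["forename"]} {rec["surname"]}',  # combined fields
    "COUNTRY":   lambda rec: "GB",                                   # constant value
}

def transform(legacy_record):
    """Build one SAP record from one legacy record via the mapping table."""
    return {sap_field: rule(legacy_record) for sap_field, rule in MAPPING.items()}

legacy = {"surname": "Smith", "forename": "Jane", "title_cd": "3"}
print(transform(legacy))
# {'NAME_LAST': 'Smith', 'TITLE': 'Ms', 'FULL_NAME': 'Jane Smith', 'COUNTRY': 'GB'}
```

Expressing every mapping in one table like this makes the rules reviewable by the business and reusable across trial loads.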

4.9 Identify Missing Data

Missing data is identified during the data mapping process, where a table or field in the SAP system does not have an identifiable source in the legacy applications.

4.10 Resolve Missing Data


SAP modules will require data that does not exist in the legacy systems. The approaches to resolve this issue are:
- Populating the missing data within the data load programs, either by calculation or mapping tables.
- Using third-party software, such as Microsoft Excel, for data staging and manipulating the data manually to fit the required format.
- Developing guidelines for use during manual migrations, which show how to determine missing values while entering the data.

4.11 Extract Legacy Data


Legacy data is extracted from the legacy systems using software specifically developed for the task with the legacy toolset available. The extracted data is loaded into staging tables and transferred to the data staging area, where it is held for further analysis, manipulation and cleansing using an appropriate tool, for example MS Excel or MS Access.
Integrity checks are required to ensure the correct data has been extracted, for example that all records are included in the extract and that duplicates have not been created.
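The integrity checks mentioned above could be automated along these lines. This is a sketch under assumptions: the extract is staged as a CSV file, the source system can report its own record count, and the key column name is illustrative.

```python
# Hedged sketch of post-extract integrity checks: completeness
# (record count matches the source) and no duplicate keys introduced.

import csv
from collections import Counter

def check_extract(path, key_column, expected_count):
    """Verify a staged CSV extract is complete and free of duplicates."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))

    errors = []
    if len(rows) != expected_count:  # all records included in the extract?
        errors.append(f"expected {expected_count} rows, got {len(rows)}")

    counts = Counter(r[key_column] for r in rows)
    dupes = [k for k, n in counts.items() if n > 1]  # duplicates created?
    if dupes:
        errors.append(f"duplicate keys: {dupes}")
    return errors
```

An empty result means the extract passed both checks; any messages returned would be investigated before the data moves on to cleansing and mapping.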

4.12 Design Automatic Loads


Where an automatic load process has been chosen as the best method to load the data into SAP, the
data load programs are designed and a technical specification written. The technical specification will
define an appropriate SAP load technique, all data mapping (legacy field to SAP field) and data
transformation logic.

4.13 Develop Automatic Loads


Automatic data loads into SAP will be developed using the technical specification defined.

4.14 Manual Data Entry


Where manual data entry has been chosen as the best method to load the data into SAP, data will be
manually entered into SAP using the appropriate transaction.

4.15 Data Loading Instructions


Instructions for the data load are to be written and agreed for all data objects in scope. This includes the order in which data is to be migrated and any dependencies.


4.16 Trial Data Upload - Running


Running trial data loads into the SAP system helps ensure data accuracy and the correct load sequence, determines load duration, and allows loading errors to be corrected. The trial data loads will continue until the load process completes successfully.

4.17 Trial Data Upload - Checking


After the trial data load run has completed, the data loaded will be checked for accuracy to ensure it
meets defined requirements.

4.18 Execution Plan for Final Uploads


This plan gives an indication of the feasibility of the final conversion into the production system in the available time frame, and specifies the data sequencing as well as the appropriate time to freeze the legacy system(s) for extracting information for the final uploads.

4.19 Dual Data Maintenance


Where master data is migrated before actual go-live, any data changes on the legacy system must also be reflected in the new SAP system. Dual maintenance is best avoided if there is a large volume of master data changes, as the additional workload may be excessive.

4.20 Data Cutover


This is the final data migration into the SAP production system. The cutover period is the time between the shutting down of the old system (and the ceasing of all related business activities) and the commencement of use of SAP. During cutover all processing on the legacy system is frozen and the data extracted.

4.21 Reconcile the Migrated Data


This process checks that the data migrated into SAP meets the specified data requirements. This includes, but is not limited to, manual data checks, record counts, checking balances, running reconciliation reports and approval of acceptable differences (e.g. rounding errors).
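A reconciliation report of the kind described could be produced as sketched below. This is illustrative only: it assumes both the staged legacy data and the loaded SAP data can be exported as lists of records, and the key and control-total field names are hypothetical.

```python
# Sketch of an automated reconciliation: compare record counts, key sets
# and an optional control total between source and loaded data.

def reconcile(source_records, loaded_records, key, total_field=None):
    """Return a reconciliation report comparing source and loaded data."""
    report = {"source_count": len(source_records),
              "loaded_count": len(loaded_records)}

    src_keys = {r[key] for r in source_records}
    tgt_keys = {r[key] for r in loaded_records}
    report["missing_in_sap"] = sorted(src_keys - tgt_keys)    # e.g. failed loads
    report["unexpected_in_sap"] = sorted(tgt_keys - src_keys)

    if total_field:  # checking balances, e.g. a monetary control total
        report["source_total"] = sum(r[total_field] for r in source_records)
        report["loaded_total"] = sum(r[total_field] for r in loaded_records)
    return report

src = [{"id": "T1", "amount": 10.0}, {"id": "T2", "amount": 5.5}]
sap = [{"id": "T1", "amount": 10.0}]
print(reconcile(src, sap, "id", "amount"))
# shows T2 missing in SAP and a 5.5 difference in the control totals
```

Any differences would either be corrected or formally approved as acceptable before sign-off.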

4.22 Data Sign-off


After the migrated data has been reconciled and checked, the data loads will be signed off; the sign-off will act as approval for the go-live of the SAP application.


5 Data Migration Guiding Principles

5.1 Data Migration Approach

5.1.1 Master Data - (e.g. Customers, Assets)

The approach is that master data will be migrated into SAP provided these conditions hold:
- The application where the data resides is being replaced by SAP.
- The master records are required to support SAP functionality post-go-live.
- There is a key operational, reporting or legal/statutory requirement.
- The master data is current (e.g. records marked for deletion need not be migrated) OR is required to support another migration.
- The legacy data is of sufficient quality so as not to adversely affect the daily running of the SAP system OR will be cleansed/enhanced sufficiently by the business within the data migration process to meet this requirement.

Note: Where the master data resides in an application that is not being replaced by SAP, but is required by SAP to support specific functionality, the data will NOT be migrated but accessed from SAP using a dynamic query look-up. A dynamic query look-up is a real-time query accessing the data in the source application as and when it is required. The advantages of this approach are:
- Avoids the duplication of data throughout the system landscape.
- Avoids data within SAP becoming out-of-date.
- Avoids the development and running of frequent interfaces to update the data within SAP.
- Reduces the quantity of data within the SAP systems.

5.1.2 Open Transactional data (e.g. Service Tickets)

The approach is that open transactional data will NOT be migrated to SAP unless ALL these conditions are met:
- There is a key operational, reporting or legal/statutory requirement.
- The legacy system is to be decommissioned as a result of the CF project in timescales that would prevent a run-down of open items.
- The parallel run-down of open items within the legacy system is impractical due to operational, timing or resource constraints.
- The SAP build and structures permit a correct and consistent interpretation of legacy system items alongside SAP-generated items.
- The business is able to commit resources to own data reconciliation and sign-off at a detailed level in a timely manner across multiple project phases.

5.1.2.1 Historical Master and Transactional data

The approach is that historical data will not be migrated unless ALL these conditions are met:
- There is a key operational, reporting or legal/statutory requirement that cannot be met by using the remaining system.
- The legacy system is to be decommissioned as a direct result of the CF project within the CF project timeline.
- An archiving solution could not meet requirements.
- The SAP build and structures permit a correct and consistent interpretation of legacy system items alongside SAP-generated items.
- The business is able to commit resources to own data reconciliation and sign-off at a detailed level in a timely manner across multiple project phases.


5.2 Data Migration Testing Cycles

In order to test and verify the migration process it is proposed that there will be three testing cycles
before the final live load:

- Trial Load 1: unit testing of the extract and load routines.
- Trial Load 2: the first test of the complete end-to-end data migration process for each data entity. The main purpose of this load is to ensure the extract routines work correctly, the staging-area transformation is correct, and the load routines can load the data successfully into SAP. The various data entities will not necessarily be loaded in the same sequence as during the live cutover.
- Trial Cutover: a complete rehearsal of the live data migration process. The execution will be done using the cutover plan in order to validate that the plan is reasonable and possible to complete in the agreed timescale. A final set of cleansing actions will come out of trial cutover (for any records which failed during the migration because of data quality issues). There will be at least one trial cutover; for complex, high-risk migrations several trial runs may be performed, until the result is entirely satisfactory and 100% correct.
- Live Cutover: the execution of all tasks required to prepare SAP for the go-live of a particular release. A large majority of these tasks will be related to data migration.

5.3 Data Cleansing

Before data can be successfully migrated it needs to be clean; data cleansing is therefore an important element of any data migration activity:
- Data needs to be consistent, standardised and correctly formatted to allow successful migration into SAP (e.g. SAP holds addresses as structured addresses, whereas some legacy systems might hold this data in a freeform format).
- Data needs to be complete, to ensure that upon migration all fields which are mandatory in SAP are populated. Any fields flagged as mandatory which are left blank will cause the migration to fail.
- Data needs to be de-duplicated and of sufficient quality to allow efficient and correct support of the defined business processes. Duplicate records can either be marked for deletion at source (the preferred option) or excluded in the extract/conversion process.
- Legacy data fields could have been misused (holding information different from what the field was initially intended for). Data cleansing should pick this up, and a decision needs to be made whether this data should be excluded (i.e. not migrated) or transferred into a more appropriate field.
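The structured-versus-freeform address point can be illustrated with a deliberately simple parser. Real address cleansing would use the LLPG or a dedicated matching tool rather than anything this naive; the function and target field names are hypothetical and only show the shape of the task.

```python
# Toy illustration: turning a freeform legacy address into the kind of
# structured fields SAP expects. Assumes comma-separated input; real
# freeform data is far messier and needs proper address matching.

def structure_address(freeform):
    """Split a comma-separated freeform address into structured fields."""
    parts = [p.strip() for p in freeform.split(",") if p.strip()]
    return {
        "STREET":   parts[0] if len(parts) > 0 else "",
        "CITY":     parts[1] if len(parts) > 1 else "",
        "POSTCODE": parts[-1].upper() if len(parts) > 2 else "",
    }

print(structure_address("12 Victoria Square, Birmingham, b1 1bd"))
# {'STREET': '12 Victoria Square', 'CITY': 'Birmingham', 'POSTCODE': 'B1 1BD'}
```

Records that cannot be parsed confidently would be referred back to the business for manual correction at source.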

It is the responsibility of the data owner (i.e. the business) to ensure the data provided to the Customer
First Project for migration into SAP (whether this is from a legacy source or a template populated
specifically for the CF project) is accurate.
Data cleansing should, wherever possible, be done at source, i.e. in the legacy systems, for the
following reasons:

- Unless a data change freeze is put in place, extracted datasets become out of date as soon as they have been extracted, due to updates taking place in the source system. When re-extracting the data at a later date to get the most recent updates, earlier data cleansing actions will be overwritten; cleansing would therefore have to be repeated each time a new dataset is extracted. In most cases this is impractical and requires a large effort.

- Data cleansing is typically a business activity. Cleansing in the actual legacy system therefore has the advantage that business people already have access to the legacy system and are familiar with the application, which is not the case when data is held in staging areas. In certain cases it may be possible to develop a programme to do a degree of automated cleansing, although this adds additional risk of data errors.

- If data cleansing is done at source, each time a new (i.e. more recent) extract is taken, the results of the latest cleansing actions will automatically come across in the extract without additional effort.


6 Data Migration Methods

There are three primary methods of transferring data from a legacy system into SAP. The most appropriate method will depend on the volume and complexity of the data to be migrated for each data object.

6.1 Legacy System Migration Workbench (LSMW)

This is a facility provided by SAP that enables migrated data to be first cleansed in spreadsheets (or Access, depending on the data volumes), and subsequently loaded into SAP via predetermined field mapping. The facility also caters for data transformation, where field values can be created from simple predefined logic. It is the preferred tool for all data migration into SAP. The main benefits are:
- A highly flexible tool provided and supported by SAP.
- Independence from SAP releases, platforms and the kind of data to be migrated.
- A step-by-step guide through the data migration process.
- Generates ABAP code once the data transfer has been configured.
- It allows additional ABAP coding for complex data loads.
- It allows reusability of data mapping and conversion rules.

6.2 LSMW process flow

The LSMW process flow is made up of the following steps:
- Read data (legacy data in spreadsheet tables and/or sequential files) from a file on a local PC or file server.
- Convert data (from the source into the target format).
- Import data (to the database used by the SAP application).
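The three steps above can be mirrored conceptually in code. LSMW itself is configured inside SAP and generates ABAP; this Python outline only illustrates the read → convert → import flow, and all file contents and field names are hypothetical.

```python
# Conceptual sketch of the LSMW flow: read legacy data, convert it via
# field assignments and conversion rules, then import the converted data.

def read_data(lines):
    """Read step: parse legacy data from a sequential file's lines."""
    return [dict(zip(["id", "name"], line.split("|"))) for line in lines]

def convert_data(records):
    """Convert step: apply field assignments and conversion rules."""
    return [{"PARTNER": r["id"], "NAME_LAST": r["name"].title()} for r in records]

def import_data(records, target):
    """Import step: hand the converted records to the load interface."""
    target.extend(records)

sap_table = []                               # stand-in for the SAP load target
legacy_file = ["c100|smith", "c101|jones"]   # stand-in for a sequential file
import_data(convert_data(read_data(legacy_file)), sap_table)
print(sap_table)
# [{'PARTNER': 'c100', 'NAME_LAST': 'Smith'}, {'PARTNER': 'c101', 'NAME_LAST': 'Jones'}]
```

In LSMW the equivalent configuration is the structure relationships, field assignments and conversion rules, with the import performed by batch input, direct input or inbound IDoc processing.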
[Figure: LSMW data flow. Legacy data on a PC or application server is read ("Read Data"), converted using the structure relationships, field assignments and conversion rules ("Convert Data"), and the converted data is imported into the SAP system via batch input, direct input or inbound IDoc processing.]

6.3

Bespoke Program Development

This is suitable where the data to be migrated is complex and in a significantly different format from that
required by SAP and there is no standard SAP data load program available. This method will only be


used as a last resort as it tends to create a large development overhead and is usually only justified if
the volume of such data is very large.

6.4 Manual Migration

If the volume of data to be migrated is relatively low or is of poor quality, the overhead of developing migration programs might not be justified. Manually keying the data using the SAP transactions allows the SAP logic to validate the data at the time of entry. The main disadvantages are the possibility of input errors (mis-keying) and, for multiple SAP rollouts, the need to input the data manually each time.


7 Data Object Structure (DOS)

For each data object identified as requiring data migration, a data object structure (DOS) will be created. The DOS is the central repository for the detailed information on the SAP data field definitions, properties and attributes of a data object. The DOS is a vital input to the task of writing functional specifications for the migration of legacy data into SAP. It is used to:
- Map legacy systems data to the SAP template data.
- Design and develop legacy data extracts.
- Design and construct data load templates.
- Design and construct data conversion programs.
- Define manual data input instructions for manual data entry.
- Derive priorities and business rules for data cleansing.
- Determine any data transformation requirements.
- Design and construct interface programs.

There is one DOS for each Data Object or group of Data Objects in scope. The main purpose of this document is to achieve:
- A consistent look and feel for all DOS records.
- Clarification of the DOS columns.
- Standard values for columns across all DOS records.
- Clear rules for different scenarios.
- Better interpretation of an SAP data object.

Each DOS sheet is an Excel file that conforms to the following generic layout. See Appendix 1 for the sample template.
DOS Sheet Field | Mandatory/Optional | Comments/Values
Screen Name | Mandatory | The name at the top of the SAP screen when entering fields. Enter once at the beginning of the section, not on each line. For example Business Partner.
Field Description | Mandatory | Field description, for example Customer Name.
Table/Structure | Mandatory | SAP table name.
Field Name | Mandatory | SAP field name.
Length | Mandatory | Length of field.
Format | Mandatory | For example CHAR, DATS, CURR.
Field Status | Mandatory | Field status: R=Required, A=Automatic, C=Conditional, O=Optional, NR=Not Required.
Comments | Mandatory | Insert comments to assist with data mapping and transformation, for example describing allowable values with the conditions for deciding which value must be used.
Source System | Mandatory | The name of the legacy system.
Legacy Table/Field | Mandatory | The name of the legacy table and field.
Conversion Rule | Mandatory | The conversion rule: describe the transformation values, any fixed values or logical rules.


8 Data Object Register

The Data Object Register will contain a definitive list of all data objects considered for migration into SAP; for each data entity, responsibility will be assigned for each activity in the data migration process. The register will form the basis for reaching a final decision on whether data migration will take place for the data object in question.
The Data Object Register layout and content is detailed in the table below:
DOR Column | Column Description
Directorate | The name of the BCC directorate.
Source Application | Source application name, e.g. 3Cs.
Rollout Sequence | The rollout sequence of the data object.
Data Object | The data object name, e.g. LLPG.
Description | A meaningful description of the data object.
Data Type | The type of data: master/transactional.
Business Data Owner(s) (BCC) | The name of the business data owner within BCC.
Service Birmingham Contact | The name of the SB contact.
Workstream Business Representative (BCC) | The name of the BCC business representative on the Customer First Programme.
Functional Consultant (CF) | The name of the SAP functional consultant responsible.
Workstream Data Analyst (CF) | The name of the data analyst responsible for the data migration.
Data Volume | The volume of data, i.e. number of records.
Data Complexity | How complex: high, medium, low.
Data Quality | Quality of the data: high, medium, low.
Data Migration Recommended (Yes/No) | The recommended data migration approach.
Recommendation Rationale | The reasons for the data migration recommendation, e.g. highly complex, very high development cost, dynamic data lookup.
Agreed Data Migration (Yes/No) | Following consultation, the agreed decision on data migration.
Agreement Rationale | The reasons for the agreed migration decision.


9 Roles and Responsibilities

The successful migration of data into SAP requires a significant and diverse input from many different sources. A complete understanding of the legacy systems and SAP data structures is necessary, as well as a functional and business understanding of the processes the data is supporting. For that reason each data entity has people assigned to each of the following roles. The responsibilities of each data entity team can be shared in whatever way suits them best, but the likely division is shown below.
Business Data Owner(s) (BCC)
o Overall responsibility for ensuring fit for use for business.
o Business sign off for: Strategy document, Trial cutover migration, Live migration.
o Ensuring necessary data cleansing/build takes place.
Workstream Business Representative (BCC)
o Facilitating flow of information between project team and business owners.
o Ensuring the project team is considering ALL parts of the business.
o Highlighting business critical data gaps.
Functional Consultant (Customer First)
o Ensuring fit for use in SAP environment.
o Create strategy for field mappings and data build where appropriate.
o Functional support for technical analysts.
Legacy System Owner (Service Birmingham)
o Build of extract routines from legacy systems.
o Timely extraction of files for build/test phases, Trial Cutover and Live Cutover.
o Highlight any data gaps from legacy systems.
Workstream Data Analyst (Customer First)
o Design/build of import routines to SAP.
o Ensuring field mappings are complete.
o Timely load of files for build/test phases, Trial Cutover and Live Cutover.


10 Key Issues & Risks

10.1 Current Data Migration Issues

Data Access - access to the data held within the BCC Specialist Service Delivery applications is
required to enable data profiling, the identification of data sources and the writing of functional and
technical specifications.
BCC Resource Availability:
o Required to assist in data profiling, the identification of data sources and to create
functional and technical specifications.
o Required to develop and run data extracts from the council's specialist service delivery
systems.
o Required to validate/reconcile/sign-off data loads.
o Required for data cleansing.
The LLPG (Local Land and Property Gazetteer) and GIS are being created outside of Customer
First. They are required by the Customer First solution and must therefore be managed as a
dependency.
Data cleansing is the responsibility of BCC, who will also require support from Service Birmingham
in this process. Customer First will help identify data anomalies during the data
migration process; however, Customer First will not cleanse the data in the council's
applications. Depending on the data quality, data cleansing can require considerable effort and
involve a large amount of resource.
The scope of the data migration requirements has not yet been finalised; as data objects are
identified they will be added to the data object register.

10.2 General Data Migration Risks

Business resources are unable to confidently reconcile large and/or complex data sets. Since
the data migration will need to be reconciled a minimum of three times (system test, trial cutover
and live cutover), the effort required within the business to comprehensively test the migrated
data set is significant. In addition, technical data loading constraints during cutover may mean only a
limited time window is available for reconciliation tasks (e.g. overnight or at weekends).
Business resources are unable to comprehensively cleanse the legacy data in line with the CF
project timescales. Since the migration to SAP may depend on a number of cleansing
activities being carried out in the legacy systems, the effort required within the business to
achieve this will increase proportionately with the volume of data migrated. Failure to complete
this exercise in the required timescale may mean that data cannot be migrated into SAP
in time for the planned cutover.
The volume of data errors in the live system may be increased if reconciliation is not completed
to the required standard. The larger/more complex a migration becomes, the more likely it is
that anomalies will occur. Some of these may initially go undetected. In the best case such
data issues can lead to a business and project overhead in rectifying the errors after the event.
In the worst case this can lead to a business operating on inaccurate data.
The more data that is migrated into SAP, the more complex and lengthy the cutover becomes,
resulting in an increased risk of not being able to complete the migration task on time. Any further
resource or technical constraints add to this risk.
Due to the volume of the task, data migration can divert project and business resources away
from key activities such as initial system build, functional testing and user acceptance testing.
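The reconciliation risk above hinges on being able to compare a legacy extract against the loaded SAP data set quickly and repeatably. As a rough illustration only, a minimal key-level reconciliation might look like the sketch below; the record layout and the `record_id` key are invented for the example.

```python
def reconcile(legacy_records, loaded_records, key="record_id"):
    """Compare a legacy extract against the loaded data set by key.

    Returns the overall counts, the keys missing from the load and the
    keys that appeared unexpectedly, so the business can sign off
    (or reject) a migration run.
    """
    legacy_keys = {r[key] for r in legacy_records}
    loaded_keys = {r[key] for r in loaded_records}
    return {
        "legacy_count": len(legacy_keys),
        "loaded_count": len(loaded_keys),
        "missing_from_load": sorted(legacy_keys - loaded_keys),
        "unexpected_in_load": sorted(loaded_keys - legacy_keys),
    }

# Illustrative data: three legacy records, one of which failed to load.
legacy = [{"record_id": "A1"}, {"record_id": "A2"}, {"record_id": "A3"}]
loaded = [{"record_id": "A1"}, {"record_id": "A3"}]
result = reconcile(legacy, loaded)
print(result["missing_from_load"])  # → ['A2']
```

A check like this only covers record counts and keys; field-level reconciliation of the migrated values would still be needed before business sign-off.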


Appendix 1 Data Object Structure Template

Screen        Field          Purpose/Description                        Field Name  Table/Structure  Length  Format  Field Status
Address Data  Title          Title key, e.g. Mr, Mrs, etc.              title       addr1_data               char
              Name 1         Address component: name of an address.     name1       addr1_data       40      char
              Name 2         Address component: name of an address.     name2       addr1_data       40      char
              Name 3         Address component: name of an address.     name3       addr1_data       40      char
              Name 4         Address component: name of an address.     name4       addr1_data       40      char
              Search term 1  Short description for a search help.       sort1       addr1_data       20      char


The template also records the following for each field: Comments, Source System, Source Table Field Name and Conversion Rule.
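The Length and Format columns in the template lend themselves to mechanical enforcement at load time. A minimal sketch of such a conversion rule for the character fields is shown below; the field lengths are taken from the table above, but the helper function itself is hypothetical and not part of any actual load routine.

```python
# Maximum lengths of the addr1_data character fields, from the template above.
FIELD_LENGTHS = {"name1": 40, "name2": 40, "name3": 40, "name4": 40, "sort1": 20}

def apply_char_rule(field_name, value):
    """Collapse whitespace and truncate to the field's declared length."""
    max_len = FIELD_LENGTHS[field_name]
    cleaned = " ".join(value.split())  # trim and collapse internal whitespace
    return cleaned[:max_len]

# sort1 is limited to 20 characters, so a long search term is truncated.
print(apply_char_rule("sort1", "  BIRMINGHAM   CITY   COUNCIL CUSTOMER FIRST  "))
# → BIRMINGHAM CITY COUN
```

In practice a rule like this would also need to flag truncations back to the business rather than silently discarding data, since lost characters are exactly the kind of anomaly the reconciliation stage must catch.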
