
Mapping Instructions: Loading Progressive Delta Data into Targets from a Live Source

Informatica developers often need mappings that pick up only the changes in source tables since the previous run, while still keeping control over the time window of data being pulled. Here is an approach I would like to share, which we implemented at a retail client.

Objectives:
1. Pick up source data from the previous successful run without depending on Informatica timestamps.
2. Override the time window on request to re-pull source data.
3. Switch source/target database connections on the fly.

Steps:
1. In PowerCenter Designer, create a mapping by importing the Source and Target.
2. Create mapping variables to hold the previous and current load timestamps: $$m_LAST_LOAD_TS and $$m_CURR_LOAD_TS.

3. Create a Source Qualifier filter to pick rows from the source table based on the timestamp. This ensures each run pulls only the rows from where the previous run left off. Here $$m_LAST_LOAD_TS is the mapping-level variable that keeps track of the previous load timestamp.
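As a sketch, the Source Qualifier filter condition (or the WHERE clause of a SQL override) might look like the following; the table name ORDERS and the audit column ROW_TS are assumed names for illustration:

```sql
-- Hypothetical example: ORDERS and ROW_TS are placeholder names
ORDERS.ROW_TS > TO_DATE('$$m_LAST_LOAD_TS', 'MM/DD/YYYY HH24:MI:SS')
```

The Integration Service substitutes the current value of $$m_LAST_LOAD_TS before the query runs, so each run pulls only rows newer than the previous load.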

RK Nalluri |Data Architect @ DatafactZ Inc

4. Add an Expression transformation after the Source Qualifier to pull all columns you want to load into the target. Along with those, create the port below as a variable: v_CURR_LOAD_TS

In the Expression, add SETVARIABLE to capture the row timestamp. Remember, when declaring the mapping variable, we set its Aggregation to MAX, so as rows are processed it retains the MAX of all row timestamps.
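A minimal version of the variable port expression, assuming the source row's audit column is named ROW_TS:

```sql
-- Expression for variable port v_CURR_LOAD_TS; ROW_TS is an assumed column name
SETVARIABLE($$m_CURR_LOAD_TS, ROW_TS)
```

Because $$m_CURR_LOAD_TS was declared with Aggregation = MAX, SETVARIABLE only raises the variable's value when the incoming ROW_TS is greater than the current value, leaving the maximum row timestamp in the variable at the end of the run.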


5. Complete the mapping by connecting the Expression ports to the target. That's all, your mapping is complete! Now, go to Workflow Manager and create a new workflow. 1. Select Edit from the Workflows menu and declare workflow-level variables.

Once you save the settings, create a session from the mapping we built. 2. Right-click the session and select Edit to open the session details. 3. Under the Components tab, click Pre-session variable assignment.


4. Move LAST_LOAD_TS from the workflow into the mapping. Because $$wf_LAST_LOAD_TS is defined as Persistent, it retains its value from the previous workflow run!

5. Once you save these settings, click Post-session on success variable assignment. Here you copy the timestamp from the mapping back to the workflow.
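The two assignments can be sketched as follows, using the workflow and mapping variable names defined above:

```
Pre-session variable assignment:
  $$m_LAST_LOAD_TS  = $$wf_LAST_LOAD_TS    -- workflow -> mapping, before the run

Post-session on success variable assignment:
  $$wf_LAST_LOAD_TS = $$m_CURR_LOAD_TS     -- mapping -> workflow, only on success
```

Because the post-session assignment fires only on success, a failed run leaves $$wf_LAST_LOAD_TS untouched and the next run re-reads the same window.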

6. Save the settings and configure all connections according to your environment. 7. Add these mapping variables to the PARM file without assigning them any values.

8. On the first run of the workflow, the variable takes a default date value from the system and the session loads your data from the source table. If the workflow succeeds, the MAX of ROW_TS is saved as $$wf_LAST_LOAD_TS; otherwise it still holds the previous timestamp. After every successful run, the new timestamp rolls forward. 9. If in future there is a need to load source data from a specific timestamp, you can do so by editing the PARM file and assigning a timestamp value: $$wf_LAST_LOAD_TS=01/01/2012 01:01:00 10. Without any change to your mapping, running with the override timestamp from the PARM file makes the workflow pick up source data from that timestamp and resume rolling $$wf_LAST_LOAD_TS forward on every subsequent run.
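A minimal PARM file entry for the override might look like this; the folder and workflow names are placeholders for your own:

```
[MyFolder.WF:wf_delta_load]
$$wf_LAST_LOAD_TS=01/01/2012 01:01:00
```

The session reads this file at start-up, and a value assigned here takes precedence over the persistent value for that run.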

