
I looked at the first 4 MHC jobs.

Below are the changes you need to make:


1) In the transformer stage NULL_HANDELING (the first transformer), change the derivation to
If Len(LnkFile_Clm3StuInc.SYSTEM_SITE_CLAIM_ID)=0 Then '^' Else LnkFile_Clm3StuInc.SYSTEM_SITE_CLAIM_ID
I made the changes in the first 3 columns; please apply the same logic to the rest of the columns.
After looking into the job, I realized that there is no NULL value in the flat file. If a column is empty, it is all blanks in the file.
2) In stage REMOVE_DUPS_FLLE (the first remove-duplicates stage), the keys should be all columns.
Here we want to remove the truly duplicate records, meaning records whose columns are all identical, so we use all columns as the remove-duplicates keys.
3) In stage ASSIGN_PARM (the last transformer stage), the derivation should look like
If (ResetNullLnk.CLAIM_LINE_NUMBER='^') Then SetNull() Else ResetNullLnk.CLAIM_LINE_NUMBER
Because the first transformer stage assigned '^' to every empty column, we need to assign '^' back to NULL here.
I made the changes in the first 3 columns; please apply the same logic to the rest of the columns.
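The '^' placeholder round trip described in items 1 and 3 can be sketched outside DataStage. A minimal Python illustration of the logic (the function names are mine, not part of the job):

```python
# Item 1 (first transformer): an empty field becomes the placeholder '^'
def mark_empty(value: str) -> str:
    return '^' if len(value) == 0 else value

# Item 3 (last transformer): the placeholder '^' becomes NULL (None here)
def restore_null(value):
    return None if value == '^' else value

# Round trip: empty fields come back as NULL, real values pass through
assert restore_null(mark_empty("")) is None
assert restore_null(mark_empty("12345")) == "12345"
```

The placeholder is needed because the flat file cannot carry a real NULL between the two transformers.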
4) Change the source database stage name from STG_MEMBERS_1 to STG_CLM_LINE1.
You can refer to the template at HSADR/Templates/HSADR_LoadMHC_Mem1.
I looked at the job HSADR_StgDeltaDntlClaimCDC and created a new job HSADR_StgDeltaDntlClaimCDC_Helen.
I made some changes in the transformer for stage variables, column derivations, and link constraints.
The job now loads data to the table. I also found that the join used the warehouse vision table instead of the warehouse dental table.
John suggested you focus on the MHC jobs; I'll take care of the Dental Claim job.
Please let me know if you have any questions.
The issue is a data type mismatch. The column ORTHO_TREAT_DATE_PLACED is char(8) before it reaches the transformer, but no type conversion is done in the transformer. You should use the stage variable ValOrthTrtDt in the derivation:
If ValOrthTrtDt=1 Then StringToDate(LnkDntPro.OrthoTreatDatePlaced,"%yyyy%mm%dd")
-- which is a valid date format
Else '1900-01-01'
Use the same logic for all date columns.
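As an illustration only (not the DataStage derivation itself), the char(8) yyyymmdd conversion with the 1900-01-01 fallback looks like this in Python; the try/except validity check stands in for the ValOrthTrtDt stage variable:

```python
from datetime import date, datetime

# Convert a char(8) yyyymmdd string to a date; fall back to 1900-01-01
# when the string is not a valid date (mirrors the ValOrthTrtDt check).
def to_date(raw: str) -> date:
    try:
        return datetime.strptime(raw, "%Y%m%d").date()
    except ValueError:
        return date(1900, 1, 1)

assert to_date("20240229") == date(2024, 2, 29)
assert to_date("        ") == date(1900, 1, 1)  # all-blank char(8) falls back
```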
Please also note that the two stage variables ErrFlag and SevFlag don't have derivations on them. They should be:
For ErrFlag
If ValOrthMthRem>1 or ValOrthTrtDt>1 .....
Then 1
Else 0
For SevFlag
If ValOrthMthRem>3000 or ValOrthTrtDt>3000 .....
Then 1
Else 0
We used both SYSTEM_SITE_CLAIM_ID and CLAIM_AND_STATUS_SEQUENCE_ID as Key_IDs. Shall we go with the 2 IDs, or should only one be used?
Using SYSTEM_SITE_CLAIM_ID and CLAIM_AND_STATUS_SEQUENCE_ID as Key_IDs is correct; then compare the rest of the columns.


We followed the same link and stage naming conventions, so please update us if any changes are needed.
You should modify stage names to correspond to the file name. For example, change the source sequential file stage from File_Members_1 to File_Clm3StuInc, and the target ODBC stage from INS_STG_MEMBERS1 to INS_STG_CLAIMS_3_STATUS.
Hi John,
We have been facing some issues since we started to work on the Tech Specs. The issues are pointed out below:
1. We need to know the source path for the files mentioned in the table below that was shared with us a couple of days back. Without this we will not be able to run the jobs.
The original files are at /dsfileset/test4.
2. We need to know the ODBC source path, i.e. the location of the TXT (fixed-width) files.
You don't use an ODBC text file stage. The job generated from Fast Track uses a text file, so you need to remove the ODBC text file stage and use a sequential file stage instead. You need to copy the files from /dsfileset/test4 to the process directory /work_area/mhc_source_files/test/. Since the original files are in DOS format, you have to remove the MS-DOS CRLF characters. I used the command dos2unix, or you can do it your own way.
3. When we execute the job, the parameter (pBatchFileDir) source batch number is defaulted to 1 (initially), but we are not able to run with the default value. We can run the job only when we remove the default value, i.e. when we do not use 1 as the parameter.
The pBatchFileDir will be passed by the sequencer in the future. For now, rename the files to 1_<FileName> in the directory /work_area/mhc_source_files/test/; because pBatchFileDir already defaults to 1 in the parameter set, you should then be able to run the job.
4. Based on Helen's comments, we need to point to adr_prestaging_dev.STAGING_MHC_TABLENAME, but it still appears as hsdev_STAGING_MHC.TABLENAME. Helen also informed us that it has something to do with the ODBC configuration, and she will let us know when it is resolved.
I sent an email last night to clarify the ODBC stage settings:
Data source: #HSADR_ParameterSet_2.pDBNamePreStg# (which is adr_prestaging_dev)
Table name: STAGING_CMS.<tablename>
You should be able to connect to the tables.
