
NAME

Phone: (M) +91 xxxxxxxxxx


Email: ******@gmail.com

CAREER OBJECTIVE

Seeking a challenging position where my dedication, expertise and talent will help the
organization create a sharper edge within the industry, contributing to the company's
growth and in turn ensuring my own professional growth within the organization.

• Deep understanding of the Hadoop ecosystem
• ETL concepts with a focus on delivering business solutions
• Good knowledge of Pig, Hive, Sqoop, HBase and Flume
• Broad exposure to Informatica PowerCenter
• Good knowledge of UNIX shell scripting and batch scheduling with the crontab scheduler
• SQL and DBMS concepts

PROFESSIONAL BACKGROUND

• COMPANY#1 — Jun 2010 to Oct 2013
• COMPANY#2 — Nov 2013 to date

TECHNICAL PROFICIENCY

ETL Tool: Informatica 8.5, 9.1
Reporting: Pentaho 5.0.6
Operating Systems: Windows, UNIX, RHEL, CentOS
Languages: SQL
Database: Oracle, HBase
Hadoop Ecosystem: HDFS, Hive, Sqoop, Pig, Oozie, Flume

PROJECTS CONTOUR

Lottamatica Datawarehouse Migration Process Phase 0

Client: Lottamatica Gaming Tech

Description: Lottamatica Gaming Tech (LGT) is the world's biggest online gaming group
and also provides various other gaming services and related technologies. LGT maintains
data in different formats (both structured and unstructured), collected across the
entire organization and growing day by day, in a legacy data warehouse maintained by
the third party EBO. Due to the huge cost involved in data maintenance, LGT decided to
migrate the legacy data warehouse to HDFS, which provides cost savings. A proof-of-concept
model was developed for the migration.

Key Deliverables:

• Installation of open-source Hadoop on CentOS
• Set-up of a multi-node Hadoop cluster
• Analysis of the dataflow; loading sample ETL datasets into HDFS
• Installation of Hive, Pig, HBase, Sqoop, Flume and Oozie
• Data summarization, querying and analysis using Hive and Pig
• Data transfer between RDBMS and HBase using Sqoop
• Streaming-data set-up using Flume
• Coding and monitoring workflows as per business logic using Oozie

Lottamatica Datawarehouse Migration Process Phase 1

Client: Lottamatica Gaming Tech

Description: Following approval of the statement of work based on the POC model, data
was migrated from the legacy data warehouse into HDFS. The data warehouse is divided
into datamarts, and historical data from the transaction datamart and the regulatory
datamart was migrated into the HBase database using the CDH Enterprise package.

Key Deliverables:

• Understanding the data model and datamart entities
• Analysis of the dataflow from the legacy data warehouse into HDFS
• Interacting with the CDH cluster using Hue
• Dataset arrangement in HDFS using the Cloudera file browser
• Data transfer between RDBMS and HDFS using Sqoop
• Unit testing of data validations on HBase

HSBC Compliance

Client: HSBC
Role: ETL Developer
Description: Transaction dumps and CDM, CMS, CTA and LTS data are maintained in a
data warehouse for analysis and for providing reports to the card associations Visa
and Mastercard. Twice a year, releases covering business operations and system
processing modifications for card issuers, acquirers and processors must be applied.
Each release details the processing standards to which issuers, acquirers and their
processors must adhere in order to remain compliant with association rules and
regulations for card acceptance.

Key Deliverables:

• Understanding the business requirements and developing an ETL process to load data
from source to target
• Validating all sources received from different source systems
• Involved in the development of Informatica mappings using transformations such as
Source Qualifier, Aggregator, Expression, Router, Filter, Rank, Sequence Generator,
Lookup and Update Strategy
• Created and monitored Informatica sessions and workflows
• Extensively worked on unit testing
• Implemented Slowly Changing Dimension methodology for historical data

eBBS Data Warehouse Maintenance

Client: Scope International


Description: The project involves replacing the existing SBS core banking system with
the new eBBS core banking system in Germany. Since multiple interfaces are connected
to a single core system, data from the different interfaces is received, and reports
are generated and shared with the regulatory boards.

Key Deliverables:

• Responsibilities ranging from requirement gathering to deployment
• Gathering functional/non-functional requirements from the customer and coordinating
with them to plan and deploy the releases to production
• Coordinating with the Loans and iBanking teams and ensuring smooth hand-over
• Prepared design documents in the design phase
• Developed mappings, sessions and workflows to extract data from the source and load
it into the target database using transformations
• Performed unit testing of the code (mappings and workflows) during the coding phase
• Prepared test data and performed system testing for all test cases in the system
testing phase
• Generating reports and transferring them via SFTP to the audit team

ACADEMIA

• B.E. (Electronics & Communications) from MNM Jain Engineering College, affiliated to
Anna University, in 2009 with First Class
• Higher Secondary from Alpha Matriculation with First Class

PERSONAL MINUTIAE

Date of Birth: DD MM YYYY
Mailing Address: *************
Languages Known: English & Tamil
Reference: ********

(NAME)
