
Job Title: Big Data Architect

Location: Phoenix, AZ

The Big Data Architect will be responsible for designing, architecting, and implementing data processing systems capable of processing, storing, and distributing data in robust Big Data solutions. Working with multi-technology and cross-functional teams and clients, you should be able to manage the entire life cycle of a solution, make architectural decisions, and provide technology leadership and direction to the organization.

Position Activities and Tasks

- Design complex, high-performance architectures
- Develop and maintain strong client relations with senior and C-level executives, developing new insights into the client's business model and pain points and delivering actionable, high-impact results
- Participate in and lead client engagements, developing plans and strategies for clients' data management processes and IT programs and providing hands-on assistance in data modeling and the technical implementation of vendor products and practices
- Facilitate, guide, and influence clients and teams toward the right information technology architecture, acting as the interface between business leadership, technology leadership, and the delivery teams
- Lead and mentor other IT consultants within the practice and across business units
- Support business development and ensure high levels of client satisfaction during delivery
- Contribute to thought capital through the creation of executive presentations, architecture documents, and IT position papers
- Provide best-practice advice to customers and team members
- Gather requirements with end users and convert them into technical documentation
- Identify and resolve performance bottlenecks
- Knowledge of Big Data features and plug-ins
- Knowledge of the Travel/Transportation/Hospitality domain is a big plus

Professional Experience Required

- 12-15 years of experience in designing, architecting, and implementing large-scale data processing/data storage/data distribution systems
- Extensive experience working with large data sets, with hands-on technology skills to design and build robust Big Data solutions
- Ability to work with multi-technology/cross-functional teams and customer stakeholders to guide and manage the full life cycle of a Hadoop solution
- Extensive experience in data modeling and database design involving any combination of:
  - Data warehousing and Business Intelligence systems and tools
  - Relational and MPP database platforms such as Netezza and Teradata
  - The open source Hadoop stack
- Hands-on administration, configuration management, monitoring, and performance tuning of Hadoop/distributed platforms
- Strong understanding of Big Data analytics platforms and ETL in the context of Big Data
- Ability to frame architectural decisions and provide technology leadership and direction
- Excellent problem-solving, hands-on engineering, and communication skills
- Knowledge of/experience with cloud computing infrastructure (e.g., Amazon Web Services EC2, Elastic MapReduce)
- Broad understanding of and experience with real-time analytics

Technical Skills Required

Hadoop: HDFS, MapReduce, Hive, HBase, Pig, Mahout, Avro, Oozie, graph databases, Spark Streaming

NoSQL: Cassandra, MongoDB, HBase

Appliances: Teradata, Netezza, Greenplum, Aster Data, Vertica

Languages/Platforms: Java, Perl/Python/PHP, Linux, Apache
