Job Overview
- Overall 10+ years of experience in technical leadership across Business Technology, data management platform/data warehousing technologies, and Big Data/cloud technologies.
- Must have designed and implemented Big Data solutions for at least two end-to-end large-scale systems.
- Experience in architecting, optimizing, and maintaining large enterprise systems.
- Experience with the Hadoop ecosystem (Sqoop, Flume, ZooKeeper, Oozie, Mahout, HBase, Sentry, HCatalog, Hue, Drill, Impala, Tez, Ambari, Chukwa, etc.).
- Experience with AWS or Azure cloud platforms.
- Expert knowledge of SQL and PL/SQL.
- Experience providing infrastructure recommendations, performing capacity planning, and developing utilities for better cluster monitoring.
- Experience managing large clusters with huge volumes of data.
- Experience with cluster maintenance tasks such as adding and removing nodes, and with cluster monitoring.
- Experience with cluster troubleshooting, including managing and reviewing Hadoop log files.
- Knowledge of at least two programming languages (Java, Python, PHP, etc.).
- Knowledge of data analysis and logical data modelling.
- Business analysis capability to work with HLDD, LLDD, and other design documents.
- Experience writing MapReduce programs, UDFs, Hive queries, and Pig scripts.
- Should be able to create POCs for each problem/scenario/case.
- Strong Linux and shell-scripting background.
- Experience with at least one NoSQL technology (MongoDB, Cassandra, HBase, Couchbase, etc.).
Job Detail
- Gender: Both
- Qualification: MCA, B.E., B.Tech
- Experience: 10+ years
- Salary: Best in industry