Role: Technical and Functional Hadoop Specialist with 2-8 years of relevant industry experience
Skills
- General operational expertise: strong troubleshooting skills and an understanding of systems capacity, bottlenecks, and the basics of memory, CPU, OS, storage, and networking.
- Hands-on experience working with Big Data platforms such as Hadoop (Hortonworks, Cloudera, MapR, etc.).
- Understanding of Apache Spark, HDFS, MapReduce, Hive, NiFi, Sqoop, and deployments based on Python, Scala, or other programming languages.
- Thorough knowledge of Hadoop architecture and HDFS.
- Hands-on experience with UNIX, SQL, and shell scripting.
- Knowledge of databases (RDBMS, SQL, NoSQL) and the ability to connect to databases from a streaming application using Python/Scala.
- Hands-on experience with Kafka interfacing with Spark Streaming applications and HDFS.
- Must have worked in product support and troubleshooting issues.
- Must have experience in incident and change management.