· Resource will be responsible for developing new jobs using Spark and Scala.
· Resource will also be responsible for testing the developed jobs and preparing test documentation.
· Should have a good understanding of and experience with optimizing the performance of Spark jobs.
· Extensive data processing experience on Hadoop systems using Spark and Scala.
· Resource must be able to convert business rules from the requirements document into Spark code.
· 3-6 years of work experience with Hadoop systems, with a good understanding of Hadoop clusters.
· Must have strong data analysis skills.
· Good knowledge of and hands-on experience in writing SQL.
· Good knowledge of Python programming.
· Good communication skills.