Senior Big Data DevOps Engineer

  • Competitive
  • Newport Beach, CA, USA
  • Permanent, Full time
  • NJF Search
  • 20 Nov 2017

Our client, a global investment firm headquartered in Southern California, is currently looking for a Senior Big Data DevOps Engineer to manage the Big Data infrastructure underpinning their proprietary trading systems, risk management systems, and portfolio management reporting applications.

Responsibilities

  • Build enterprise monitoring solutions, implement configuration management and runbook automation, and standardize processes.
  • Manage big data applications and the underlying infrastructure, driving scalability and stability improvements early in the development lifecycle while ensuring operational best practices are supported.
  • The ideal candidate is an energetic self-starter and a team player with a passion for software engineering and automation, who works well under pressure and has strong communication and technical skills.


Requirements

  • Minimum Bachelor’s degree in computer science or a related field.
  • Minimum of 3 years' experience with Big Data technologies, with expertise in HDFS, YARN, Spark, Hive/Impala, Kafka, and Oozie.
  • Minimum of 4 years of DevOps or combined development and operations experience.
  • Hands-on experience managing distributed systems and clusters.
  • Expertise in scripting languages such as Python.
  • Understanding of C++/Java and microservice/SOA architecture.
  • Data mining and machine learning experience with strong quantitative analysis skills is a bonus.
  • Excellent communication skills, with strong problem-solving, analytical, and time management abilities.
  • Experience analyzing and resolving performance, scalability, and reliability issues.