AVP, Data Engineer, Group Consumer Banking and Big Data Analytics Technology, Technology & Operations
Business Function
Group Technology and Operations (T&O) enables and empowers the bank with an efficient, nimble and resilient infrastructure through a strategic focus on productivity, quality & control, technology, people capability and innovation. In Group T&O, we manage the majority of the Bank's operational processes and aspire to delight our business partners through our multiple banking delivery channels.

Responsibilities
- Design and build enterprise observability infrastructure from scratch.
- Create and maintain optimal data pipelines.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing jobs and code for greater scalability, etc.
- Work with stakeholders, including the Product Owner and the Data and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Work with data and analytics experts to strive for greater functionality in our data systems.

Requirements
- Working SQL knowledge and experience with relational, NoSQL and time-series databases.
- Experience with Kubernetes and the Python programming language.
- Experience building and optimizing 'big data' pipelines, jobs and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytical skills related to working with structured and unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large datasets.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.
- We are looking for a candidate with 5+ years of experience in a Data or Software Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. The candidate should also have experience using the following software/tools:
- ELK, Kafka & Grafana
- Relational SQL and time-series databases, such as MariaDB and InfluxDB
- Data pipeline and workflow management tools: Airflow, NiFi, etc.
- Stream-processing systems: Spark Streaming, Flink, etc.

Apply Now
We offer a competitive salary and benefits package and the professional advantages of a dynamic environment that supports your development and recognises your achievements.