ABSA Career – Specialist Hadoop Data Engineer

Job Description:

The Risk Services team within Group Shared Services is looking for a Hadoop Developer to work embedded as a member of a squad, or across multiple squads, to produce, test, document & review algorithms & data-specific source code that supports the deployment & optimisation of data retrieval, processing, storage & distribution for a business area.

Job Responsibilities:

  • Ensure the quality assurance and testing of all data solutions aligned to the QA Engineering & broader architectural guidelines and standards of the organisation
  • Understand the technical landscape and bank-wide architecture that is connected to, or dependent on, the business area supported in order to effectively design & deliver data solutions (architecture, pipelines etc.)
  • Design data retrieval, storage & distribution solutions (and/or components thereof), including contributing to all phases of the development lifecycle e.g. the design process
  • Create & maintain sophisticated CI/CD pipelines (authoring & supporting CI/CD pipelines in Jenkins or similar tools and deploying to multi-site environments – supporting and managing your applications all the way to production)
  • Participate in design thinking processes to successfully deliver data solution blueprints
  • Develop high quality data processing, retrieval, storage & distribution design in a test driven & domain driven / cross domain environment
  • Implement & align to the Group Security standards and practices to ensure the indisputable separation, security & quality of the organisation’s data
  • Leverage state-of-the-art relational and NoSQL databases as well as integration and streaming platforms to deliver sustainable, business-specific data solutions
  • Meaningfully contribute to & ensure solutions align to the design and direction of the Group Infrastructure standards and practices e.g. OLAs, IaaS, PaaS, SaaS, containerisation etc.
  • Monitor the performance of data solution designs & ensure ongoing optimisation of data solutions
  • Meaningfully contribute to & ensure solutions align to the design & direction of the Group Architecture & in particular data standards, principles, preferences & practices. Short-term deployments must align to strategic long-term delivery.
  • Build analytics tools that utilise the data pipeline by quickly producing well-organised, optimised and documented source code & algorithms to deliver technical data solutions
  • Automate tasks through appropriate tools and scripting technologies e.g. Ansible, Chef
  • Translate/interpret the data architecture direction and associated business requirements & leverage expertise in analytical & creative problem solving to synthesise data solution designs (build a solution from its components) beyond the analysis of the problem
  • Support the continuous optimisation, improvement & automation of data processing, retrieval, storage & distribution processes

Job Requirements:

  • Bachelor’s Degree in Information Technology, plus Hadoop certifications
  • 5 years of Hadoop experience
  • Ability to model complex data types and structures
  • Scala, Python, Spark & SQL
  • Knowledge of AWS or Azure (cloud) big data services would be an advantage

Job Details:

Company: Absa

Vacancy Type: Full Time

Job Location: Randburg, Gauteng, SA

Application Deadline: N/A

Apply Here