Data Engineer at Interswitch Group

July 27, 2023
The application deadline has passed.

Job Description

Interswitch is an Africa-focused integrated digital payments and commerce company that facilitates the electronic circulation of money as well as the exchange of value between individuals and organisations on a timely and consistent basis. We started operations in 2002 as a transaction switching and electronic payments processing company, and have progressively evolved into an integrated payment services company, building and managing payment infrastructure as well as delivering innovative payment products and transactional services throughout the African continent. At Interswitch, we offer unique career opportunities for individuals capable of playing key roles and adding value in an innovative and fun environment.

Job Purpose

  • To understand data sets collected from different sources, to put them in a centralised data store ready for analytics and data science, and to transform them for business intelligence or machine learning.
  • To apply software engineering principles to ETL development using a new set of tools.
  • To create and enhance products which benefit from the power of distributed processing.
  • To understand the infrastructure and data architecture to identify reasons for successful and failed products.


Data Architecture:

  • Consult and educate stakeholders on methods for streamlining and standardising data recording to ensure quality and accuracy.
  • Develop, construct, test and maintain architectures, such as databases and large-scale processing systems to enable the usability, accessibility and transformation of data.
  • Collaborate with other Data Architecture teams within Interswitch to identify synergies and drive a cohesive approach to data governance and architecture across payment processing.

Performance Improvement through Business Intelligence:

  • Support creation of machine learning algorithms by applying standard statistical analysis or data preparation methods.
  • Employ regression modelling to model the relationship between unique sets of data, using statistical and engineering methods.
  • Study consumption trends and segmentation analysis to inform marketing and business performance metrics.
  • Narrow large datasets down to the series of data presented to key business stakeholders.

Internal Communications and Capability Building:

  • Help others get the most out of internal communications systems by offering support and advice.
  • Serve as an advocate for data-driven product design, evangelise insights on what is working and what is not to help drive incremental gains in pipeline and revenue.
  • Act as mentor and coach to team members while fostering an environment of mutual respect and trust among senior-level team members.
  • Maintain an understanding of relevant technology, external regulation and industry best practices through ongoing education, attending conferences and reading specialist media.
  • Deepen understanding of underlying structures that constitute a data pipeline and manipulate these structures to ensure quality assurance and to enable data scientists to build effective algorithms, models and tools.
  • Use statistical and engineering techniques to apply data processing tools to real world business structures.

Technical Database Support:

  • Design distribution of basic database resources and provide physical modeling and design services to tune database applications for optimum performance.
  • Assist in the development of large-scale data structures and pipelines to organise, collect and standardise data that helps generate insights and addresses reporting needs.
  • Apply understanding of key business drivers to inform decision-making.
  • Use expertise, judgment and precedent to contribute to the resolution of moderately complex problems.
  • Interface with cross-functional teams to define data requirements and source data, whether raw data or modelled data and logic, to support the insight, dashboard, and report requirements that inform decision-making.

Advanced and Predictive Analytics:

  • Support interpretation of advanced and predictive analytics data, using specialised software tools and functionalities.
  • Understand a wide range of technologies in depth, pick the right tool for the job, and write code in Scala, Java, or Python to create resilient and scalable solutions.
  • Assess performance and recommend next steps as new products are launched and provide new measurement methodologies on key metrics to assess the success and failures of a product.


Qualifications:

  • University First Degree in Computer Science, Information Technology, Statistics, Mathematics, Finance or related fields.
  • At least 6 years' relevant experience in Data Management roles, ideally within financial or FinTech institutions, including a minimum of 3 years in data engineering.