About Rentokil Terminix
Rentokil Terminix is a global leader in pest control and hygiene services, protecting people, enhancing lives, and preserving our environment. As part of our continued digital transformation, we are building data-driven solutions that empower our teams, enhance customer experience, and improve operational efficiency.
Role Summary
We are seeking a Data Engineer to design, build, and maintain robust data pipelines and platforms that support enterprise reporting, advanced analytics, and AI/ML initiatives. The ideal candidate will work closely with data scientists, analysts, and business stakeholders to ensure data is accurate, accessible, secure, and scalable.
Key Responsibilities
- Design, develop, and maintain data pipelines for ingesting, transforming, and integrating data from multiple sources (CRM, ERP, IoT devices, field service apps, etc.).
- Optimize data workflows for scalability, performance, and reliability in both batch and real-time processing.
- Collaborate with data analysts and scientists to deliver high-quality datasets that enable business intelligence and predictive analytics.
- Implement and enforce data quality, governance, and security standards.
- Work with cloud platforms (e.g., AWS, Azure, GCP) to build scalable data solutions.
- Support the development of data lakes, data warehouses, and data marts for enterprise reporting.
- Partner with business teams to understand requirements and provide data-driven solutions for customer insights, operational efficiency, and growth initiatives.
- Stay current with emerging technologies and best practices in data engineering.
Required Skills & Experience
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Proven experience as a Data Engineer, ETL Developer, or in a similar role.
- Strong expertise in SQL and Python or Scala.
- Experience with data pipeline orchestration tools (e.g., Apache Airflow, Luigi, Prefect).
- Experience with Big Data frameworks (e.g., Spark, Hadoop).
- Experience with cloud data platforms (AWS Redshift, Azure Synapse, Google BigQuery, Snowflake).
- Knowledge of data modeling, warehousing, and ETL best practices.
- Experience with API integrations, streaming platforms (Kafka, Kinesis, Pub/Sub), and structured/unstructured data sources.
- Strong understanding of data governance, security, and compliance.
- Excellent problem-solving and communication skills with the ability to work cross-functionally.
Preferred Qualifications
- Experience in field services, logistics, or customer-facing industries.
- Familiarity with IoT data processing (e.g., smart traps, sensors, or telematics).
- Knowledge of machine learning model deployment pipelines (MLOps).
- Experience with DevOps practices (CI/CD, containerization with Docker/Kubernetes).
Job Type: Full-time
Pay: ₦3,000,000.00 - ₦6,138,000.00 per month