We are seeking a Senior Data Engineer to join our Platform team. You will be responsible for ensuring that data pipelines run reliably across our businesses and that our data warehouses remain secure and optimised, so analytics and business intelligence teams have efficient access to data.
Responsibilities

Data Infrastructure Management
Collaborate with portfolio companies to define data transfer formats for centralised storage on Amazon S3.
Design, build, and maintain scalable data pipelines using Airflow and dbt.
Develop and maintain the ClickHouse data warehouse, ensuring scalability and efficient data storage and retrieval to support analytics and reporting.
Set up CI/CD procedures.
Data Quality and Monitoring
Monitor and troubleshoot data workflows to ensure data integrity, consistency, and quality.
Maintain data quality by implementing automated validation processes.
Create efficient data processing workflows and automation tasks using Python.
Technology Advancement
Stay current with the latest advancements in Data Engineering, including cloud technologies and their integration into workflows.
Qualifications

Technical Expertise
5+ years of experience as a Data Engineer.
Strong SQL skills and familiarity with data warehousing concepts.
Strong experience with PostgreSQL and ClickHouse, including data modelling and the execution and optimisation of complex queries.
Strong experience with Python for data processing and automation tasks.
Strong experience with Airflow for orchestrating ETL processes.
Experience with dbt for data transformations and data quality testing.
Experience with Data Quality frameworks, methodologies, and tools (e.g., automated validation processes).
Experience working with on-premise environments and understanding the challenges of integrating with cloud-based systems.
Experience with AWS or Google Cloud, especially with cloud data warehouses.
Soft Skills
Strong problem-solving skills and attention to detail.
Ability to collaborate with cross-functional teams, including data analysts, engineers, and business stakeholders.
Preferred Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Experience working in startups or fast-paced environments.
Experience working in the iGaming industry.
Experience with Kafka, Flink, or other streaming technologies for designing and maintaining real-time data pipelines.
Experience with BI tools such as Tableau and Superset.
Knowledge of data governance and security frameworks.
If this role sounds like a fit, we’d love to hear from you! Just send over your CV and anything else you’d like us to consider.
We’ll review everything within five working days, and if your background matches what we’re looking for, we’ll get in touch to set up a call and get to know each other better.