We are Toogeza, a Ukrainian recruiting company focused on hiring talent and building teams for tech startups worldwide. People make the difference in the big game, and we help you find the right ones.
Currently, we are looking for a Data Engineer/Developer for SpinLab.
Location: Remote
Job Type: Full-Time
About our client:
We help slot gaming leaders unlock the potential of their data, enhancing business outcomes and strengthening their competitive edge in the market. We collect and process data using advanced methods and technologies to provide our clients with clear, actionable recommendations based on real metrics. SpinLab’s goal is not just to collect data, but to help businesses use it for maximum efficiency and amplified results.
About the Role:
We’re looking for a hands-on Data Engineer with a strong focus on data infrastructure, analytics collaboration, and cost-efficient operations. In this role, you will develop and optimize ETL processes, ensure data reliability and scalability, and work closely with analytics and product teams to support data-driven decision-making.
You will also contribute to the effective management of cloud resources, with a focus on automation, cost optimization, and continuous improvement.
Responsibilities:
Understand, format, and prepare data for analytics and data-science processes.
Design, build, and optimize scalable ETL/ELT pipelines for batch and streaming data.
Collaborate with analysts to understand data needs and ensure accessible, well-modeled data sets.
Dive deep into system metrics and usage patterns to identify opportunities for FinOps-driven cost savings.
Manage data infrastructure on GCP (BigQuery, Cloud Composer, Vertex AI, Kubernetes, etc.).
Automate infrastructure provisioning using Pulumi or Terraform.
Set up data quality monitoring, alerting, and logging systems.
Collaborate with data scientists and ML engineers to productionize models and build supporting pipelines.
Continuously improve performance, scalability, and cost-efficiency of data workflows.
Requirements:
Strong experience with Python and SQL for data engineering.
Solid understanding of cloud platforms (ideally GCP) and data services (BigQuery, Cloud Storage, etc.).
Hands-on experience with Infrastructure-as-Code tools like Pulumi or Terraform.
Experience with Airflow, dbt, or similar orchestration/transform tools.
Proficiency in Docker and Kubernetes for data workflows.
Understanding of Linux systems, cloud networking, and security best practices.
Experience with CI/CD pipelines and version control (GitLab or similar).
A mindset for continuous improvement, optimization, and working cross-functionally.
Nice to have:
Previous exposure to FinOps practices or cost-optimization work in cloud environments.
Experience with ClickHouse.
Experience with AWS.
Familiarity with iGaming, B2B SaaS, or Fintech domains.
Experience supporting data science/ML workflows in production.
Cloud/data-related certifications.
Benefits:
Work on meaningful data products and shape them with your vision.
25 vacation days + birthday leave.
Flexible, remote-friendly culture with a small, dedicated team.
English classes with native speakers.
Health insurance.
Annual education & development budget.
What’s next?
If this role sounds like a fit — we’d love to hear from you! Just send over your CV and anything else you’d like us to consider.
We’ll review everything within five working days, and if your background matches what we’re looking for, we’ll get in touch to set up a call and get to know each other better.