BrainRocket is a global company creating end-to-end tech products for clients across Fintech, iGaming, and Marketing. Young, ambitious, and unstoppable, we've already taken Cyprus, Malta, Portugal, Poland, and Serbia by storm. Our BRO team consists of 1,300 bright minds creating innovative ideas and products. We don't follow formats. We shape them. We build what works, launch it fast, and make sure it hits.
The Data Engineering Lead will spearhead our data engineering efforts, bringing expertise in Apache Airflow, Snowflake, and team leadership.
This role involves not only hands-on technical responsibilities but also leadership in guiding, mentoring, and managing a team of data engineers to architect and maintain robust data solutions.
✅ Responsibilities:
✔️Leadership: Lead the design, development, and maintenance of scalable data pipelines using Apache Airflow and Snowflake. Provide technical guidance, best practices, and mentorship to the data engineering team.
✔️Team Management: Manage a team of data engineers, fostering a collaborative and innovative environment. Assign tasks, set goals, and conduct performance evaluations to ensure the team's success and growth.
✔️Architecture and Strategy: Drive the architecture and strategy for data infrastructure, ensuring scalability, reliability, and efficiency. Collaborate with cross-functional teams to align data engineering initiatives with business objectives.
✔️Data Warehousing Expertise: Oversee Snowflake data warehouse management, including schema design, optimization, security, and performance tuning. Ensure adherence to best practices and governance standards.
✔️ETL Implementation: Lead the implementation of complex data workflows and scheduling using Airflow, ensuring robustness, monitoring, and optimization.
✔️Collaboration and Communication: Collaborate with stakeholders, data scientists, analysts, and other teams to understand data requirements. Communicate effectively to translate business needs into technical solutions.
✔️Continuous Improvement: Drive continuous improvement initiatives, identify areas for enhancement, and implement best practices to optimize data engineering processes and workflows.
✅ Requirements:
✔️Technical Expertise: Proven experience with Apache Airflow (including Amazon MWAA) and Snowflake, including designing and implementing scalable data pipelines and data warehouse solutions.
✔️Cloud-Based Stack: Hands-on experience with AWS services (especially S3 and MWAA), Python, and SQL within cloud-based environments.
✔️Collaboration & Communication: Strong interpersonal skills with the ability to work effectively across cross-functional teams and communicate complex technical concepts to both technical and non-technical stakeholders.
✔️Strategic Mindset: Ability to align technical solutions with business goals, contributing to innovation and efficiency across data engineering processes.
✔️Analytical & Problem-Solving Skills: Strong troubleshooting abilities, with a focus on optimizing data workflows and maintaining data accuracy and integrity.
✅ Why should you join us?
✔️Opportunities for career progression in a fast-growing European company
✔️Private Health Insurance
✔️Corporate Discounts
✔️Regular team & company events
✔️People-oriented management without bureaucracy
✔️24 days of paid holidays
✔️Friendly team
✔️Full-time, in-house, standard business hours
✔️Competitive salary
Bold moves start here. Make yours. Apply today! 🚀