BrainRocket is a global company creating end-to-end tech products for clients across Fintech, iGaming, and Marketing. Young, ambitious, and unstoppable, we've already taken Cyprus, Malta, Portugal, Poland, and Serbia by storm. Our BRO team consists of 1,300 bright minds creating innovative ideas and products. We don’t follow formats. We shape them. We build what works, launch it fast, and make sure it hits.
BrainRocket is a software development company and digital solutions provider. The company has created over 40 cutting-edge products spanning 20 different markets. Our team of around 900 tech-savvy professionals delivers scalable projects tailored to each customer's needs. We also strive to create a culture centered on personal and professional growth for employees, in a positive and welcoming environment.
We are seeking a highly skilled Data Engineer with expertise in designing, managing, and optimizing data pipelines using Apache Airflow, Snowflake, and Apache Kafka.
This individual will play a pivotal role in architecting robust, scalable, and efficient data solutions, ensuring the integrity, reliability, and accessibility of our data infrastructure.
✅ Responsibilities:
The responsibilities and duties of the Data Engineer include, but are not limited to:
- Develop and implement data models to support business requirements, optimizing for performance and scalability
- Design, build, and maintain scalable data pipelines using Apache Airflow
- Implement and maintain Kafka-based streaming data pipelines for real-time data processing and integration with various systems
- Integrate with third-party databases and APIs
- Establish monitoring, alerting, and maintenance procedures to ensure the health and reliability of data pipelines
- Collaborate with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements
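For candidates less familiar with the stack: the first two responsibilities boil down to the extract-transform-load pattern that Airflow DAGs orchestrate. Below is a minimal, framework-free sketch of that pattern; all function names and sample data are illustrative, not part of our codebase, and in a real DAG each function would be wired up as an Airflow task.

```python
# Minimal ETL sketch: each function stands in for one Airflow task.

def extract():
    # Stand-in for pulling raw rows from a source system.
    return [{"user": "a", "amount": "10"}, {"user": "b", "amount": "5"}]

def transform(rows):
    # Cast amounts to integers and drop malformed rows.
    clean = []
    for row in rows:
        try:
            clean.append({"user": row["user"], "amount": int(row["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return clean

def load(rows):
    # Stand-in for writing to a warehouse table: aggregate per user.
    totals = {}
    for row in rows:
        totals[row["user"]] = totals.get(row["user"], 0) + row["amount"]
    return totals

if __name__ == "__main__":
    print(load(transform(extract())))  # {'a': 10, 'b': 5}
```

In production these stages run as separate, retryable tasks with dependencies declared between them, which is exactly what Airflow's scheduling and monitoring add on top of plain functions like these.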
✅ Requirements:
We are looking for a team player with experience in:
- Data warehousing and data modelling techniques
- Designing, building, and maintaining complex data pipelines using Airflow
- Designing and implementing scalable data solutions using Snowflake or Redshift, backed by a proven track record in data engineering roles
- Implementing Kafka-based streaming architectures for real-time data processing
- Python and SQL, including hands-on data manipulation and transformation
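The Kafka requirement above centers on the consume-process loop that every streaming job implements. As a broker-free illustration, the sketch below uses an in-memory deque as a stand-in for a topic partition; names and payloads are hypothetical, and a real consumer would poll the broker and commit offsets instead.

```python
import json
from collections import deque

def process(message: bytes) -> str:
    # Stand-in for per-message work (parse, enrich, route).
    return json.loads(message)["event"]

def consume(partition: deque) -> dict:
    # Drain the partition, counting events per type.
    counts = {}
    offset = 0
    while partition:
        event = process(partition.popleft())
        counts[event] = counts.get(event, 0) + 1
        offset += 1  # a real consumer would commit this offset to Kafka
    return counts

if __name__ == "__main__":
    topic = deque([b'{"event": "click"}', b'{"event": "view"}', b'{"event": "click"}'])
    print(consume(topic))  # {'click': 2, 'view': 1}
```

The same loop shape carries over to real Kafka clients; the added complexity there is partition assignment, offset management, and failure handling.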
✅ Why should you join us?
- Opportunity for career progression in a fast-growing European company
- Private Health Insurance
- Corporate Discounts
- Regular team & company events
- People-oriented management without bureaucracy
- 24 days of paid holidays
- Friendly team
- Full-time, in-house, standard business hours
- Competitive salary
Bold moves start here. Make yours. Apply today!