
Data DevOps Engineer

Hard Rock Digital
Contract
Remote
Worldwide
Data Analytics

What are we building?

Hard Rock Digital is a team focused on becoming the best online sportsbook, casino, and social casino company in the world. We’re building a team that shares a passion for learning, operating, and building new products and technologies for millions of consumers. We care about each customer’s interaction, experience, behavior, and insight, and we strive to ensure we’re always acting authentically.


Rooted in the kindred spirits of Hard Rock and the Seminole Tribe of Florida, the new Hard Rock Digital taps a brand known the world over as the leader in gaming, entertainment, and hospitality. We’re taking that foundation of success and bringing it to the digital space — ready to join us?

What’s the position?

We are seeking a passionate Data DevOps Engineer who loves optimizing pipelines, automating workflows, and scaling cloud-based data infrastructure.

You’ll work as part of the Data DevOps team, collaborating with Data Science, Machine Learning, Reporting, and other data-related teams to deploy and support cutting-edge data applications. This fast-paced role lets you make an immediate impact and, as you grow with the team, offers the opportunity to drive technical improvements across the organization.

What will you do?

As a Data DevOps Engineer, you will:

  • Design, build, and optimize data pipelines using Airflow, DBT, and Databricks.

  • Monitor and improve pipeline performance to support real-time and batch processing.

  • Manage and optimize AWS-based data infrastructure, including S3 and Lambda, as well as Snowflake.

  • Implement best practices for cost-efficient, secure, and scalable data processing.

  • Enable and optimize Amazon SageMaker environments for ML teams.

  • Collaborate with ML, Data Science, and Reporting teams to ensure seamless data accessibility.

  • Implement data pipeline monitoring, alerting, and logging to detect failures and performance bottlenecks.

  • Build automation to ensure data quality, lineage tracking, and schema evolution management.

  • Participate in incident response, troubleshooting, and root cause analysis for data issues.

  • Advocate for DataOps best practices, driving automation, reproducibility, and scalability.

  • Document infrastructure, data workflows, and operational procedures.



What are we looking for?

We are looking for a Data DevOps Engineer with experience supporting high-velocity data and development teams and a track record of designing and maintaining data infrastructure, pipelines, and automation frameworks. You should also have experience streamlining data workflows with tools such as Airflow, DBT, Databricks, and Snowflake while maintaining data integrity, security, and performance.

The ideal candidate will have:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience.

  • 3+ years of experience in DevOps, DataOps, or a similar role.

  • Proficiency in key technologies, including Airflow, Snowflake, and SageMaker.

  • Certifications in AWS, Snowflake, or other relevant technologies are a plus.

  • Excellent communication and interpersonal skills.

  • Ability to work in a fast-paced environment and manage multiple priorities effectively.
