Senior Data Engineer (m/f/d)

Posted on Wednesday, 21st January 2026

Consultancy
Berlin
Negotiable
Contract

Remote Senior Data Engineer (m/f/d)

Our Client is a forward-thinking organisation in the hospitality technology sector, dedicated to empowering hoteliers around the world to enhance their pricing strategies through innovative data solutions. Founded in 2017, they have swiftly established themselves as a game-changer for small hotels navigating the complexities of digitalisation, and they have built a robust and dynamic culture focused on collaboration and growth.

As part of their growth strategy and commitment to innovation, Our Client is seeking a Remote Senior Data Engineer to join their reservations data team. This pivotal role offers a unique opportunity to transform raw reservation data into high-quality insights that are crucial for their operations. You will be at the forefront of building data pipelines that directly influence critical business decisions, making a significant impact on revenue optimisation and booking rates.

Responsibilities

  • Design, build, and maintain scalable data pipelines using a modern technology stack (Snowflake, Dagster, and dbt).
  • Oversee end-to-end data flows, ensuring reliability from ingestion through to analytics-ready models.
  • Drive the migration from legacy data pipelines to a contemporary data architecture.
  • Collaborate with cross-functional teams, including Product Managers and other Engineers, to prioritise high-impact data integrations.
  • Guarantee data quality and reliability through robust testing, monitoring, and clear documentation practices.
  • Support multiple internal teams by delivering accurate and timely reservation data.
  • Continuously enhance operational efficiency and scalability in alignment with growing data demands.
  • Take full ownership of features from design to production and post-deployment iteration.

Essential Skills & Experience

  • Minimum of 4 years of professional experience in Python, specifically in data engineering or backend systems.
  • Proven experience in constructing and maintaining ETL/ELT pipelines within cloud data warehouses such as Snowflake or BigQuery.
  • Strong skills in data modelling, including development of analytics-ready schemas and performance optimisation.
  • Familiarity with orchestrated data pipelines using tools like Dagster or Airflow.
  • Experience in building backend services using Python frameworks such as Django or FastAPI.
  • Strong understanding of cloud infrastructure, preferably AWS, and experience working with large datasets using libraries such as pandas or NumPy.
  • Analytical mindset focused on data quality and data-driven decision-making.

Desirable Skills & Experience

  • Experience with dbt (for modelling, testing, and documentation) and familiarity with data observability and monitoring solutions.
  • Experience with infrastructure as code tools and knowledge of modernising legacy pipelines.

If you are a qualified candidate eager to take on a challenging role at a vibrant, remote-first company where your contributions will make a significant impact, we encourage you to apply by submitting your CV.

Apply for this role