Company hidden
Posted 1 day ago

Senior DataOps Engineer (Revenue Management)

Work format: hybrid
Employment type: full-time
Seniority: senior
English: B2
Country: Germany

Job description


TL;DR

Senior DataOps Engineer (Revenue Management): build and maintain the analytical backbone that powers pricing insights and future dynamic pricing models, with an emphasis on transforming and combining data from multiple sources. The focus is on developing scalable data models and transformations, ensuring data quality, and collaborating with stakeholders to translate business questions into clear analytical and data requirements.

Location: Hybrid setup with 50% in-office time in Munich, Germany, plus the option to work up to 8 weeks a year from other inspiring locations.

Company

hirify.global is one of the world’s fastest-growing vacation rental technology companies.

What you will do

  • Build and own dynamic pricing and revenue management datasets by transforming and combining data from multiple sources.
  • Develop scalable data models and transformations that support pricing analyses, dashboards, and forecasting use cases.
  • Work closely with Data Scientists to create training datasets and features for pricing and demand models.
  • Advise and support Data Scientists with model deployment.
  • Ensure data quality, consistency, and documentation across revenue management metrics and datasets.
  • Collaborate with stakeholders to translate business questions into clear analytical and data requirements.

Requirements

  • 4+ years of experience as a Data Engineer, DataOps Engineer, Software Engineer or similar role.
  • Strong hands-on skills in SQL and Python, working with complex data models.
  • Experience building and implementing Lakehouse architectures in AWS or other similar setups.
  • Ability to build batch and streaming data pipelines using technologies such as Airflow, DBT, Redshift, Athena/Presto, Firehose, Spark, and SQL databases.
  • Good understanding of how distributed systems work.
  • DataOps knowledge (e.g., Infrastructure as Code, CI/CD for data pipelines, automated testing, monitoring/observability implementation, data quality management methodologies).

Nice to have

  • Desire to learn and use cutting-edge LLM tools and agents to improve your and the entire team's productivity.
  • A proactive, hands-on mindset: you take ownership, spot problems, and drive solutions forward.

Culture & Benefits

  • Shape the future of travel with products used by millions of guests and thousands of hosts.
  • Grow professionally in a culture that thrives on curiosity and feedback.
  • Join a team of smart, motivated and international colleagues who challenge and support each other.
  • Work in a modern tech environment.
  • Work in a hybrid setup with 50% in-office time for collaboration, with the option to spend up to 8 weeks a year working from other inspiring locations.
  • Travel benefits, gym discounts, and other perks to keep you energized.

Hiring process

  • Apply online on our careers page!
  • Your first point of contact will be Katharina from HR.
