Company hidden
2 days ago

Data Engineer

Work format
Remote (Germany only)
Employment type
Full-time
Seniority
Senior
English
C1
Country
Germany
Vacancy from Hirify.Global, a list of international tech companies

Job description


TL;DR

Data Engineer: Building and maintaining core datasets and APIs for global trade information, with an emphasis on REST API design, streaming pipelines, and data quality controls. Focus on end-to-end ownership of development tasks, implementing data schema evolution, and ensuring system resilience.

Location: Remote from Germany

Company

hirify.global is a product company providing global trade information and insights for commodities, energy, and maritime sectors.

What you will do

  • Build and maintain hirify.global's core datasets, REST APIs, and streaming/batch pipelines (Kafka Streams, Spark).
  • Take end-to-end ownership of development tasks from understanding requirements to deployment and monitoring.
  • Design and build functionality, including APIs and data processing components, and see changes through code review and deployment.
  • Write and execute comprehensive unit, integration, and functional tests aligned with defined scenarios.
  • Monitor system performance, alerts, and SLOs to ensure optimal functionality and reliability post-release.
  • Partner closely with Product and cross-functional teams to translate requirements into high-quality technical solutions.
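The pipeline work above can be pictured with a small Scala sketch: one transformation stage with a data quality gate, using plain collections in place of a real Kafka Streams or Spark job. All record fields, names, and validation rules here are hypothetical, not taken from hirify.global's actual schemas:

```scala
// Hypothetical trade record; the fields are illustrative only.
final case class TradeEvent(id: String, commodity: String, tonnes: Double)

object TradePipeline {
  // Data quality gate: drop records that would corrupt downstream datasets.
  def isValid(e: TradeEvent): Boolean =
    e.id.nonEmpty && e.commodity.nonEmpty && e.tonnes > 0.0

  // One pipeline stage: validate, then aggregate tonnes per commodity.
  // In a production job this would be a Kafka Streams topology or Spark stage.
  def tonnesByCommodity(events: Seq[TradeEvent]): Map[String, Double] =
    events
      .filter(isValid)
      .groupBy(_.commodity)
      .view
      .mapValues(_.map(_.tonnes).sum)
      .toMap
}
```

Placing the validation filter before the aggregation is the usual pattern: bad records are rejected (or routed to a dead-letter topic) before they can skew derived datasets.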

Requirements

  • 3-5 years of experience in data-focused software engineering roles.
  • Strong programming skills in Scala (or another JVM language); Python experience is preferred.
  • Proven experience designing and operating RESTful APIs and versioned interfaces.
  • Good understanding of data modeling, schema evolution, and serialization technologies (e.g., Avro, Protobuf).
  • Experience building and maintaining batch or streaming data systems, with knowledge of streaming patterns and reliability concerns.
  • Fluency in English and excellent communication skills.
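Schema evolution, mentioned in the requirements, typically means a reader must tolerate records written under an older schema. A minimal Scala sketch of the rule behind Avro-style backward compatibility: a new field gets a default, so v1 payloads that lack it still decode into the v2 type (the record and field names are hypothetical):

```scala
// v2 of a record adds a `source` field. Giving it a default lets v1 data,
// which lacks the field, still decode -- the same rule Avro applies for
// backward-compatible schema evolution.
final case class TradeV2(id: String, tonnes: Double, source: String)

object SchemaEvolution {
  val DefaultSource = "unknown"

  // Decode a loosely typed payload (a Map standing in for a deserialized
  // Avro/Protobuf record) into the v2 case class.
  def decode(fields: Map[String, String]): Option[TradeV2] =
    for {
      id     <- fields.get("id")
      tonnes <- fields.get("tonnes").flatMap(_.toDoubleOption)
    } yield TradeV2(id, tonnes, fields.getOrElse("source", DefaultSource))
}
```

Required fields (`id`, `tonnes`) make the decode fail fast, while the added field degrades gracefully; that asymmetry is what keeps old producers and new consumers compatible.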

Nice to have

  • Experience with Apache Airflow for workflow orchestration.
  • Exposure to cloud platforms (preferably AWS) and infrastructure as code (Terraform).
  • Experience with Docker and Kubernetes in production environments.
  • Hands-on knowledge of Kafka and event-driven or microservices architectures.
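The event-driven architectures listed above share one core shape: producers publish to a topic without knowing who consumes it. A toy Scala sketch of that decoupling, with an in-memory bus standing in for Kafka (all names hypothetical):

```scala
// Minimal publish/subscribe dispatch: handlers register per topic, and
// publishers never reference consumers directly. An in-memory map stands
// in for a real broker such as Kafka.
object EventBus {
  private var handlers = Map.empty[String, List[String => Unit]]

  // Register a handler for a topic; multiple handlers per topic are allowed.
  def subscribe(topic: String)(h: String => Unit): Unit =
    handlers = handlers.updated(topic, h :: handlers.getOrElse(topic, Nil))

  // Deliver a payload to every handler subscribed to the topic.
  def publish(topic: String, payload: String): Unit =
    handlers.getOrElse(topic, Nil).foreach(_(payload))
}
```

A real broker adds what this sketch omits: durable logs, consumer offsets, partitioning, and delivery guarantees, which is where the "reliability concerns" from the requirements come in.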

Culture & Benefits

  • Dynamic company dedicated to nurturing connections and innovating solutions.
  • Team acts decisively, builds together, and supports one another.
  • Commitment to providing a fair, inclusive, and diverse work environment.
  • Welcomes people of different backgrounds, experiences, abilities, and perspectives.
