Company hidden
Updated 8 hours ago

Data Engineer (AWS, Databricks)

Work format
Remote (Europe only)
Employment type
Full-time
English
B2
Country
Poland, Armenia, Romania, Cyprus, Latvia, Kazakhstan, Georgia, Bulgaria
Listing from Hirify Global, a list of companies with Eastern European roots
Job description

TL;DR

Data Engineer (AWS/Databricks): building and optimizing high-load data pipelines and services for a cloud-native Lakehouse platform on AWS and Databricks, with an emphasis on data migration, real-time streaming, and machine learning use cases. The focus is on designing scalable, resilient data solutions, ensuring data quality, and optimizing performance in a cloud environment.

Location: Remote from Bulgaria, Georgia, Kazakhstan, Poland or onsite in Almaty, Astana, Cluj-Napoca, Krakow, Larnaca, Lodz, Lublin, Riga, Sofia, Tbilisi, Varna, Warsaw, Wroclaw, Yerevan.

Company

hirify.global's client is one of the largest betting communities, operating a leading online betting exchange powered by cutting-edge technology.

What you will do

  • Migrate legacy SQL Server workloads to a modern Lakehouse architecture on AWS and Databricks.
  • Design, build, and maintain data pipelines for batch and real-time processing.
  • Ensure data quality, reliability, and scalability across all pipelines and services.
  • Collaborate with data scientists, analysts, and business stakeholders to deliver data solutions for analytics and ML use cases.
  • Implement best practices for data governance, security, and compliance.
  • Optimize performance and cost efficiency in a cloud-native environment.

Requirements

  • Strong proficiency in Python for data engineering tasks.
  • Hands-on experience with AWS services (S3, Glue, Lambda, EMR).
  • Expertise in Databricks and Spark for big data processing.
  • Solid understanding of SQL and relational database concepts.
  • Experience with ETL/ELT frameworks and workflow orchestration tools (e.g., Airflow).
  • Knowledge of data modeling, data warehousing, and Lakehouse principles.

Nice to have

  • Familiarity with streaming technologies (Kafka, Kinesis).
  • Experience with CI/CD pipelines for data solutions.
  • Understanding of data security and compliance in cloud environments.
  • Exposure to machine learning workflows and MLOps concepts.

Culture & Benefits

  • Paid vacation according to local laws.
  • Health insurance support for you and your family.
  • 10 days of sick leave without a doctor's note.
  • Time off for state holidays.
  • Regular corporate parties and team get-togethers.
  • Support for IT certifications and access to learning platforms.

Be careful: if an employer asks you to log in to their system via iCloud/Google, send them a code or password, or run code/software, do not do it - these are scammers. Report the listing or contact support.

The job posting text is reproduced without changes.
