
Data Engineer

Work format: remote
Employment type: full-time
English: B2
Countries: Poland, Kazakhstan, Georgia, Bulgaria

Job description

Data Engineer at dataart.com

Almaty, Astana, Cluj-Napoca, Krakow, Larnaca, Lodz, Lublin, Riga, Sofia, Tbilisi, Varna, Warsaw, Wroclaw, Yerevan; remote from Bulgaria, Georgia, Kazakhstan, or Poland

Hot vacancy. Small team (1-10 people).

Client
Our client is one of the largest betting communities, having pioneered the betting exchange model back in 2000. Powered by cutting-edge technology, they operate the world’s leading online betting exchange.

Position overview
We are looking for a Data Engineer to support the migration and modernization of our existing SQL Server–based data workloads to a cloud-native Lakehouse platform built on AWS and Databricks. In this role, you will design and operate scalable, resilient, high-quality data pipelines and services that empower analytics, real-time streaming, and machine learning use cases across the organization.
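To give a flavor of the core migration step described above, here is a minimal PySpark sketch that reads a legacy SQL Server table over JDBC and lands it as a Delta table on S3. The host, database, table, credentials, and bucket path are hypothetical placeholders, not details from the posting.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver-to-lakehouse").getOrCreate()

# Pull a legacy table out of SQL Server over JDBC
# (host, database, table, and credentials are placeholders).
legacy = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://legacy-host:1433;databaseName=exchange")
    .option("dbtable", "dbo.settled_bets")
    .option("user", "etl_user")
    .option("password", "***")  # in practice, fetch from a secrets manager
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Land the data as a Delta table on S3, forming the raw (bronze) layer
# of the Lakehouse; on Databricks the Delta format is available out of the box.
(
    legacy.write.format("delta")
    .mode("overwrite")
    .save("s3://example-lakehouse/bronze/settled_bets")
)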

Responsibilities
- Migrate legacy SQL Server workloads to a modern Lakehouse architecture on AWS and Databricks.
- Design, build, and maintain data pipelines for batch and real-time processing.
- Ensure data quality, reliability, and scalability across all pipelines and services.
- Collaborate with data scientists, analysts, and business stakeholders to deliver data solutions for analytics and ML use cases.
- Implement best practices for data governance, security, and compliance.
- Optimize performance and cost efficiency in a cloud-native environment.

Requirements
- Strong proficiency in Python for data engineering tasks.
- Hands-on experience with AWS services (e.g., S3, Glue, Lambda, EMR).
- Expertise in Databricks and Spark for big data processing.
- Solid understanding of SQL and relational database concepts.
- Experience with ETL/ELT frameworks and workflow orchestration tools (e.g., Airflow).
- Knowledge of data modeling, data warehousing, and Lakehouse principles.

Nice to have
- Familiarity with streaming technologies (Kafka, Kinesis).
- Experience with CI/CD pipelines for data solutions.
- Understanding of data security and compliance in cloud environments.
- Exposure to machine learning workflows and MLOps concepts.

Apply on the company website