
Data Engineer (AI Data Platform)

Employment type
fulltime
Grade
junior/middle
English
B2
Country
Poland
Vacancy from Hirify.Global, a list of international tech companies.

Job description


TL;DR

Data Engineer (AI Data Platform): Design and build high-performance data pipelines and curated datasets for a modern Lakehouse and AI platform, with an emphasis on scalability, data quality, and reliability. Focus on developing robust production data assets and applying distributed data processing frameworks to support analytics and AI use cases.

Location: Warsaw, Poland

Company

A leading global investment banking, securities and investment management firm.

What you will do

  • Build and support batch and streaming data pipelines on the Lakehouse and AI data platform.
  • Develop raw, refined, and curated datasets that support analytics, reporting, and AI use cases.
  • Implement controls to validate completeness, accuracy, and consistency of data across pipelines.
  • Refactor and modernize existing data flows to improve reliability, performance, and maintainability.
  • Collaborate with platform teams and data consumers to deliver scalable and fit-for-purpose data products.
  • Apply sound data modelling principles to represent business entities and historical changes accurately.
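To illustrate the data-quality controls mentioned in the duties above, here is a minimal sketch in plain Python (no Spark dependency) of completeness and null-rate checks between a raw and a refined pipeline stage. The datasets and field names are invented for illustration; a production pipeline would run equivalent checks inside the processing framework itself.

```python
# Hypothetical pipeline stages: a raw extract and the refined layer built from it.
raw = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": 250.5},
    {"id": 3, "amount": None},   # incomplete record, dropped downstream
]
refined = [
    {"id": 1, "amount": 100.0},
    {"id": 2, "amount": 250.5},
]

def completeness(raw_rows, refined_rows):
    """Share of raw record ids that survived into the refined layer."""
    raw_ids = {r["id"] for r in raw_rows}
    refined_ids = {r["id"] for r in refined_rows}
    return len(raw_ids & refined_ids) / len(raw_ids)

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

print(round(completeness(raw, refined), 2))  # 0.67
print(round(null_rate(raw, "amount"), 2))    # 0.33
```

A pipeline would typically fail or alert when such metrics fall outside agreed thresholds, rather than silently publishing incomplete data.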

Requirements

  • Bachelor’s or master’s degree in a relevant quantitative discipline or equivalent practical experience.
  • Strong hands-on programming experience in Python or Java.
  • Good working knowledge of SQL, including troubleshooting and optimization.
  • Experience working with distributed data processing frameworks such as Apache Spark.
  • Familiarity with software engineering fundamentals, including version control, testing, and CI/CD practices.
  • Understanding of temporal data modelling, schema design, and partitioning techniques.
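The temporal data modelling requirement above usually refers to techniques such as slowly changing dimensions, where history is preserved by versioning rows instead of updating them in place. A minimal Type 2 sketch in plain Python, with all table and column names hypothetical:

```python
from datetime import date

# Hypothetical SCD Type 2 dimension: each row carries a validity interval,
# and the current version is the row with valid_to == None.
dim = [
    {"id": 1, "city": "Warsaw", "valid_from": date(2023, 1, 1), "valid_to": None},
]

def apply_change(rows, key, column, new_value, as_of):
    """Close the current row for `key` and append a new versioned row."""
    for row in rows:
        if row["id"] == key and row["valid_to"] is None:
            if row[column] == new_value:
                return rows  # no actual change; keep history as-is
            row["valid_to"] = as_of
            rows.append(dict(row, **{column: new_value,
                                     "valid_from": as_of,
                                     "valid_to": None}))
            return rows
    return rows

apply_change(dim, 1, "city", "Krakow", date(2024, 6, 1))
# dim now holds two rows: the closed "Warsaw" version and the open "Krakow" one.
```

The same idea carries over to partitioned lakehouse tables, where validity columns or snapshot dates commonly double as partition keys.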

Nice to have

  • Experience with Snowflake, Apache Iceberg, Databricks, or the Hadoop ecosystem.
  • Knowledge of common data formats such as JSON, Avro, and Parquet.
  • Experience with Kafka and Kubernetes-based deployment approaches.

Culture & Benefits

  • Best-in-class benefits package including wellness and personal finance offerings.
  • Commitment to diversity and inclusion with firmwide networks.
  • Extensive training and professional development opportunities.
  • Inclusive work environment with reasonable accommodations for special needs.
