Company hidden
2 days ago

Mid Java Developer (GCP)

600-960 PLN
Work format
hybrid
Employment type
full-time
Level
middle
English
B2
Country
Poland
Vacancy from Hirify.Global, a list of international tech companies

Job description


TL;DR

Mid Java Developer (GCP/Spark): Development and expansion of an advanced analytical and operational data ecosystem powering e-commerce campaign performance and investment decision-making tools, with an emphasis on large-scale data processing, attribution models, and scalable pipelines. The focus is on designing reporting engines, productionizing data models, and ensuring high-throughput reliability at massive scale.

Location: Remote (with monthly visits to Warsaw and willingness to participate in on-call duties)

Salary: 600-960 PLN per man-day on a B2B contract

Company

A market-leading Polish e-commerce platform with over 20 million users and an IT team of 1,000+, partnered with hirify.global since 2021 on microservices-based development.

What you will do

  • Design and develop reporting engines and attribution systems that surface campaign insights and performance data.
  • Build and maintain high-throughput data pipelines using Spark, Airflow, and Kafka.
  • Collaborate with Data Science teams to productionize advanced data models.
  • Ensure scalability, reliability, and performance of large-scale data systems.
  • Develop internal tools for bulk management, agency panels, and RBAC systems.
  • Support complex account structures and permissions management.
  • Participate in on-call duties.

Requirements

  • At least 3 years of commercial experience with Java / Kotlin / Scala
  • Solid understanding of data processing and analytics systems
  • Experience with distributed data pipelines and orchestration tools
  • Knowledge of GCP cloud environment
  • Experience with large-scale data systems (e.g. BigQuery, Spark)
  • Familiarity with microservices architecture and event-driven systems
  • Understanding of data modeling and data productionization

Nice to have

  • Knowledge of Apache ecosystem, Avro, Terraform

Culture & Benefits

  • Work on high-impact data ecosystem with modern tech stack (GCP, Spark, Airflow, Kafka, microservices).
  • Exposure to large-scale data processing, advanced analytics, and production-grade models.
  • Autonomous teams developing cutting-edge technologies for millions of users.
  • Proprietary professional development program and wide range of benefits.
  • Charitable and educational initiatives supporting IT careers and community.
