
Big Data Developer (Scala/Spark/Hadoop)

1 000 - 1 350 PLN/day
Work format: hybrid
Employment type: full-time
Seniority: middle/senior
English: B2
Country: Poland
Vacancy from Hirify.Global, a list of international tech companies.

Job description


TL;DR

Big Data Developer (Scala/Spark/Hadoop): building a new global customer account reporting engine that consolidates account and payment transaction data into Hadoop and processes billions of transactions monthly for 10M+ customers, with an emphasis on high-performance, scalable, and maintainable data processing. The focus is on developing distributed, parallelized data pipelines and optimizing financial datasets with Scala and Spark in an Agile SAFe team.

Location: flexible hybrid from Gdynia / Gdańsk / Warszawa, at least 2 days per week in the office

Salary: 1000-1350 PLN/day on B2B
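
To give a flavour of the reporting engine described in the TL;DR, here is a minimal Scala/Spark sketch of a monthly per-account aggregation read from Hive and written back to the Hadoop cluster. All table, column, and application names are illustrative assumptions, not the client's actual schema.

// Minimal sketch only: table and column names (payments.transactions,
// accounts.account_master, reporting.customer_account_monthly) are assumed.
import org.apache.spark.sql.{SparkSession, functions => F}

object MonthlyAccountReport {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("monthly-account-report")
      .enableHiveSupport()                   // source data is registered in Hive
      .getOrCreate()
    import spark.implicits._

    // Payment transactions landed in Hadoop, filtered to one booking month
    // passed as the first argument, e.g. "2024-05".
    val transactions = spark.table("payments.transactions")
      .where($"booking_month" === args(0))
      .select($"account_id", $"currency", $"amount")

    // Customer account master data.
    val accounts = spark.table("accounts.account_master")
      .select($"account_id", $"customer_id")

    // Aggregate per customer account; Spark distributes the shuffle and the
    // aggregation across executors, which is what makes billions of rows tractable.
    val report = transactions
      .join(accounts, Seq("account_id"))
      .groupBy($"customer_id", $"account_id", $"currency")
      .agg(F.count(F.lit(1)).as("txn_count"), F.sum($"amount").as("total_amount"))

    // Persist the consolidated reporting table back to the cluster so
    // downstream (e.g. Oozie-scheduled) jobs can read one partition at a time.
    report.write
      .mode("overwrite")
      .partitionBy("currency")
      .saveAsTable("reporting.customer_account_monthly")

    spark.stop()
  }
}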

Company

hirify.global is a Polish company with over 700 experienced experts that builds highly specialised teams for clients worldwide, offering growth-oriented projects and a wide range of benefits.

What you will do

  • Designing and implementing a high-performance, scalable global reporting solution.
  • Developing distributed and parallelized data pipelines for massive transaction processing.
  • Optimizing and transforming financial datasets using Scala and Spark.
  • Integrating multiple data sources using Hadoop, Kafka, Hive, and Oozie (a streaming ingestion sketch follows this list).
  • Collaborating in an Agile SAFe team with Solution Architects, Analysts, and Developers.
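
The integration bullet above is the streaming side of the same picture. Below is a hypothetical Spark Structured Streaming sketch that lands Kafka transaction events on HDFS as Parquet for later batch reporting. The topic name, broker address, JSON schema, and HDFS paths are assumptions, and running it would additionally require the spark-sql-kafka connector package on the classpath.

// Hypothetical sketch: topic, brokers, schema, and paths are illustrative only.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, from_json}
import org.apache.spark.sql.types._

object TransactionIngest {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("transaction-ingest")
      .getOrCreate()

    // Expected JSON payload of a single payment transaction event.
    val schema = StructType(Seq(
      StructField("account_id", StringType),
      StructField("amount", DecimalType(18, 2)),
      StructField("currency", StringType),
      StructField("booked_at", TimestampType)
    ))

    // Read the raw event stream from Kafka.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "payments.transactions")
      .load()

    // Parse the JSON value and keep only the typed business columns.
    val events = raw
      .select(from_json(col("value").cast("string"), schema).as("tx"))
      .select("tx.*")

    // Land the stream on HDFS as Parquet so batch reporting jobs
    // (scheduled e.g. via Oozie) can consume it.
    events.writeStream
      .format("parquet")
      .option("path", "hdfs:///data/raw/transactions")
      .option("checkpointLocation", "hdfs:///checkpoints/transaction-ingest")
      .start()
      .awaitTermination()
  }
}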

Requirements

  • 3+ years of experience in functional programming with Scala.
  • Experience developing Spark-based applications in Scala.
  • Hands-on experience with Hadoop stack (Hive, Oozie, Kafka).
  • Familiarity with containerized technologies (Docker, Kubernetes).
  • Strong communication skills and ability to work in distributed teams.
  • English: B2 minimum

Nice to have

  • Deep understanding of distributed systems principles.
  • Experience in data engineering and ETL/ELT pipelines.
  • Performance tuning experience for Hadoop/Spark solutions.
  • Experience developing RESTful services.
  • Knowledge of Bootstrap and ReactJS.
  • Experience working in Agile methodologies.

Culture & Benefits

  • Flexible work organization.
  • International work environment.
  • Scandinavian organizational culture promoting work-life balance.
  • Time for additional training (financed by Jit).
  • Regular integration meetings with the Jit community.
