Company hidden
15 hours ago

Backend Developer (ETL Pipeline)

Work format: onsite
Employment type: fulltime
Grade: middle
English: B2
Country: US
Vacancy from Hirify.Global, a list of international tech companies.

Job description

TL;DR

Backend Developer (ETL Pipeline): Designing and implementing ETL data pipelines that support data ingestion, transformation, and integration within a bank ecosystem, with an emphasis on developing services using Java, Apache Spark, and AWS-native services. Focus on building scalable data workflows, automating file transfers, and ensuring compliance with data governance and security standards.

Location: On-Site in McLean, VA or Wilmington, DE

Company

hirify.global is seeking an experienced Backend Developer for a W2 contract engagement within a bank ecosystem.

What you will do

  • Design, build, and maintain ETL pipelines for data ingestion, transformation, and delivery of large data sets.
  • Develop scalable and maintainable backend services using Java and Apache Spark.
  • Build data workflows using AWS Step Functions, Glue, and Lambda.
  • Automate file transfers, ingestion, and validation from Discover into the Bank environment.
  • Implement monitoring, alerting, and logging for data pipeline reliability.
  • Collaborate with data engineers, QA, and cloud platform teams to ensure integration and performance.
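
For a concrete flavor of the pipeline work described above, the sketch below shows a minimal Spark batch ETL job in Java: read raw files from a landing location, normalize a few columns, and write partitioned Parquet for downstream consumers. It is illustrative only and assumes Spark 3.x; the bucket paths, column names, and class name are hypothetical, not taken from the role.

// Illustrative only: a minimal Spark batch ETL job in Java.
// Paths and column names below are hypothetical.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.to_date;

public class TransactionEtlJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("transaction-etl")
                .getOrCreate();

        // Ingest: read raw CSV files landed by an upstream file transfer.
        Dataset<Row> raw = spark.read()
                .option("header", "true")
                .csv("s3://example-landing-bucket/transactions/"); // hypothetical path

        // Transform: normalize types, drop malformed rows, keep needed columns.
        Dataset<Row> cleaned = raw
                .withColumn("amount", col("amount").cast("decimal(18,2)"))
                .withColumn("posted_date", to_date(col("posted_date"), "yyyy-MM-dd"))
                .filter(col("account_id").isNotNull())
                .select("account_id", "posted_date", "amount", "merchant");

        // Deliver: write partitioned Parquet for downstream consumers.
        cleaned.write()
                .mode("overwrite")
                .partitionBy("posted_date")
                .parquet("s3://example-curated-bucket/transactions/"); // hypothetical path

        spark.stop();
    }
}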

Requirements

  • Strong programming skills in Java for data processing and service design.
  • Hands-on experience with Apache Spark (batch or streaming data processing).
  • Proficiency in AWS cloud services (Step Functions, Glue, Lambda).
  • Experience building and maintaining ETL pipelines or similar data ingestion workflows.
  • Solid understanding of data structures, algorithms, and system performance optimization.
  • Experience with CI/CD, Git-based workflows, and containerization.
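
As an illustration of the AWS side of these requirements (a Lambda reacting to file arrivals, in the spirit of the file-transfer automation listed above), here is a minimal sketch of a Java Lambda handler that performs a basic validation of objects landed in S3. It assumes the aws-lambda-java-events library; the class name, file-extension check, and log format are hypothetical.

// Illustrative only: a minimal AWS Lambda handler in Java that validates
// files landed in S3. Naming conventions below are hypothetical.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;

public class IncomingFileValidator implements RequestHandler<S3Event, String> {

    @Override
    public String handleRequest(S3Event event, Context context) {
        // Each record describes one object-created notification.
        event.getRecords().forEach(record -> {
            String bucket = record.getS3().getBucket().getName();
            String key = record.getS3().getObject().getKey();
            Long size = record.getS3().getObject().getSizeAsLong();

            // Basic checks: expected file extension and non-empty payload.
            boolean valid = key.endsWith(".csv") && size != null && size > 0;

            context.getLogger().log(
                    String.format("file=%s/%s size=%d valid=%b", bucket, key, size, valid));

            // In a real pipeline this is where the file would be routed onward
            // (e.g. copied to a processing prefix) or flagged for alerting.
        });
        return "ok";
    }
}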

Nice to have

  • Experience with S3, DynamoDB, or Redshift.
  • Familiarity with Terraform, CloudFormation, or other IaC tools.
  • Exposure to Kafka or event-driven data ingestion.
  • Knowledge of Python for data manipulation or Glue scripting.

Culture & Benefits

  • Work in a fast-paced, collaborative environment.
  • Opportunities for proactive problem-solving, debugging, and performance optimization.
  • Focus on strong communication within the team.
