Company hidden
2 days ago

Senior Data Engineer

$157,000 - $218,000
Work format
hybrid
Employment type
full-time
Seniority
senior
English
B2
Country
US
Listing from Hirify.Global, a directory of international tech companies

Job description


TL;DR

Senior Data Engineer: building and maintaining the scalable, high-quality data pipelines that power hirify.global's centralized data platform, with an emphasis on system design and collaboration with cross-functional partners. The role focuses on ensuring pipeline reliability and data quality by implementing testing, monitoring, and observability best practices.

Location: Denver, Colorado, United States; San Francisco, California, United States. Individuals are expected to work from the office 3 days a week. A relocation stipend may be available for those willing to relocate to a hirify.global hub location.

Salary: on-target earnings or base salary range of $185,000 - $218,000 USD (San Francisco, CA) or $157,000 - $185,000 USD (Denver, CO).

Company

hirify.global is building the data platform to power safe and fair decisions.

What you will do

  • Independently design and implement complex batch and streaming pipelines using PySpark, SQL, and AWS services.
  • Work cross-functionally with product, design, analysts, and engineers to ship impactful features and improve data workflows.
  • Contribute to architectural discussions and system improvements.
  • Ensure pipeline reliability and data quality, implementing testing, monitoring, and observability best practices.
  • Investigate and resolve production issues for services owned by the team.
  • Support the team in building foundational datasets that enable analytics, ML, and customer-facing features.

Requirements

  • 6–7+ years of experience in data engineering with strong hands-on execution ability.
  • Proficiency with PySpark, Python, and SQL, including debugging and performance optimization.
  • Experience building large-scale pipelines (terabyte-scale and beyond), with exposure to streaming systems such as Kafka.
  • Strong knowledge of data modeling, relational databases, and NoSQL stores.
  • Experience with AWS services such as EMR, Glue, Athena, Lambda, and S3.
  • Understanding of security and data privacy fundamentals.

Nice to have

  • Exposure to Iceberg or other lakehouse technologies.
  • Knowledge of Databricks, Snowflake, or Graph/Vector stores is a plus.

Culture & Benefits

  • A hybrid work environment strengthens collaboration, drives innovation, and encourages connection.
  • In-office perks are provided, such as lunch five times a week, a commuter stipend, and an abundance of snacks and beverages.

Be careful: if an employer asks you to sign in to their system via iCloud/Google, send a code or password, or run code or software, do not do it; these are scammers. Be sure to click "Report" or contact support. More details in the guide →