Company hidden
11 hours ago

Data Engineer III

$113,200 - $127,900
Work format
remote (USA only)
Employment type
full-time
Grade
senior
English
C1
Country
US
Vacancy from Hirify.Global, a list of international tech companies

Job description


TL;DR

Data Engineer III (AWS/Spark/Python): Developing, expanding, and optimizing data pipeline architecture and ETL processes for cross-functional teams, with an emphasis on scalability, data quality, and big data tools. Focus on designing scalable data infrastructure, implementing large-scale data processing, automating manual processes, and resolving technical issues in high-volume environments.

Fully remote (US, Eastern Time Zone, 9:00 AM to 5:00 PM ET; US public trust clearance required: US citizen, or US resident for 3 of the last 5 years with a valid passport/visa)

$113,200 - $127,900

Company

Modern digital services company partnering with US federal government agencies to build intuitive products for veterans, service members, families, and seniors.

What you will do

  • Develop, expand, and optimize data pipelines and architecture for cross-functional teams including developers, analysts, and scientists.
  • Build and maintain ETL processes, PoCs with Redshift Spectrum, Databricks, AWS EMR, SageMaker; handle large-scale dataset engineering and data quality analysis.
  • Operate data processing pipelines, assemble complex datasets, and implement process improvements for scalability and automation.
  • Build infrastructure for data extraction/transformation/loading using AWS and SQL; create analytical tools for business metrics.
  • Collaborate with stakeholders on data issues, write tests, work with DevOps on CI/CD/IaC, perform code reviews.

Requirements

  • US public trust clearance: US citizen, or 3 of the last 5 years of US residency plus a valid passport/visa/work permit
  • 8+ years of data engineering/software development experience, including 4+ years with Python/Java/cloud for pipelines; Bachelor's in CS or a related field, or 10+ years of IT experience.
  • Expert in data pipelines, wrangling, architecture for scalability/performance; big data tools (Spark, Hadoop), DBs (MySQL, Postgres), Airflow, AWS (Redshift, EMR, EC2), stream processing.
  • Strong analytics on unstructured data, root cause analysis, data transformation processes; Agile, TDD, GitHub, Jira/Confluence.
  • Excellent written/spoken English; flexible in a fast-paced team environment; Eastern Time Zone alignment.

Nice to have

  • Federal gov contracting, CMS/healthcare data experience (Medicaid/CHIP).
  • Certifications: Databricks, Google/IBM/Cloudera Data Engineer.

Culture & Benefits

  • Remote work in Eastern Time Zone (9-5 ET); occasional travel <5% for training/meetings.
  • Highly competitive salaries and full healthcare benefits.
  • Fast-paced, team-oriented environment with Agile methodology.
