This vacancy is archived

ΠŸΠΎΡΠΌΠΎΡ‚Ρ€Π΅Ρ‚ΡŒ ΠΏΠΎΡ…ΠΎΠΆΠΈΠ΅ вакансии ↓
Company hidden
Updated 2 months ago

Senior Data Engineer (Databricks)

Work format
Remote (Mexico only)
Employment type
Full-time
Seniority
Senior
English
B2
Country
Mexico

Job description


TL;DR

Senior Data Engineer (Databricks): Designing, building, and optimizing scalable data pipelines and lakehouse architectures with an emphasis on data quality, performance, and reliability. Focus on implementing medallion architecture, enforcing data governance, and migrating from legacy systems to AWS-based Databricks environments.

Location: Remote (Mexico)

Company

hirify.global is a global company offering Software and Digital Engineering solutions across practices including Cloud Services, Data & Analytics, and AI & LLM Engineering.

What you will do

  • Design, build, and optimize scalable data pipelines and lakehouse architectures in Databricks using the medallion model.
  • Develop and maintain ETL/ELT processes in Python and PySpark, ensuring high data quality and reliability.
  • Implement and enforce data governance, security, encryption, and PII protection standards.
  • Collaborate with engineering and business teams to translate requirements into dimensional models.
  • Support migration from legacy systems to AWS-based Databricks environments.
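The responsibilities above include enforcing PII protection as part of data governance. As an illustration only (not part of the posting), one common technique is pseudonymizing identifiers with a keyed hash before records reach analytical layers; the field names, salt, and layer terminology below are hypothetical, and the sketch uses only the Python standard library.

```python
import hashlib
import hmac

# Hypothetical secret salt; in practice this would come from a secrets manager.
SALT = b"example-secret-salt"

def pseudonymize(value: str, salt: bytes = SALT) -> str:
    """Pseudonymize a PII value with a keyed hash (HMAC-SHA256).

    The same input always maps to the same token, so joins across tables
    still work, but the original value cannot be recovered without the salt.
    """
    return hmac.new(salt, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Example: mask the email column of a record before it lands in a curated layer.
record = {"user_id": 42, "email": "jane.doe@example.com", "amount": 19.99}
masked = {**record, "email": pseudonymize(record["email"])}
```

In a Databricks pipeline this kind of masking would typically run as a PySpark UDF or column expression during the bronze-to-silver transformation, but the core idea is the same deterministic keyed hash shown here.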

Requirements

  • Bachelor’s degree in Computer Science, Engineering, or a related technical field.
  • 5+ years of progressive experience in Data Engineering building scalable, production-grade data platforms.
  • 3+ years of deep hands-on experience with Databricks and modern Lakehouse architecture in enterprise environments.
  • Expert-level proficiency in Python and PySpark, designing high-performance distributed data processing pipelines.
  • Strong expertise designing and implementing dimensional data models for analytical workloads.
  • Proven experience implementing and optimizing medallion architecture with strong data quality validation frameworks.
  • Advanced knowledge of data governance, encryption standards, data lineage, anonymization, and secure handling of PII data.
  • Strong experience operating in AWS cloud environments, including performance tuning and cost optimization.
  • Databricks certification strongly preferred (effectively required).

Nice to have

  • Experience leveraging AI-assisted development tools or agentic coding platforms.

Culture & Benefits

  • Commitment to diversity and inclusion, hiring professionals based solely on their skills and without discrimination.